I like that they are very blunt and to the point.
It'd be really cool if we could add more social dynamic features to papers/research and have people correct papers in the open, like Twitter community notes, to the point that if you can really disprove a paper, it gets marked "wrong" or something like that on arXiv. Sort of in between collaborative science, hunting for patent prior art, and bug bounty hunting. Getting paid for it would be even better: making money by sniping issues in research. One can dream.
I wonder how much money in bounties it'd take to proof-invalidate whole branches of psychology research (insert your own pet peeve here) in a methodical way. Probably not that much.
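To make the dream slightly more concrete, here's a purely hypothetical sketch of what one such "community note" record could look like. Nothing like this exists on arXiv; every name, field, and verdict state below is invented for illustration:

    # Purely hypothetical sketch -- nothing like this exists on arXiv, and every
    # name, field, and verdict state here is invented for illustration.
    from dataclasses import dataclass, field
    from enum import Enum

    class Verdict(Enum):
        OPEN = "open"                # note submitted, still under discussion
        UPHELD = "upheld"            # flaw confirmed, but not fatal
        REJECTED = "rejected"        # flaw not substantiated
        INVALIDATED = "invalidated"  # flaw sinks the paper's central claim

    @dataclass
    class CommunityNote:
        paper_id: str            # e.g. an arXiv identifier like "2101.00001"
        section: str             # the part of the paper the note targets
        claim: str               # the specific statement being disputed
        evidence: str            # counterexample, failed replication, re-analysis
        bounty_usd: float = 0.0  # the "paid to snipe issues in research" idea
        verdict: Verdict = Verdict.OPEN
        endorsers: list[str] = field(default_factory=list)  # independent reviewers

    note = CommunityNote(
        paper_id="2101.00001",
        section="3",
        claim="The main effect survives all robustness checks",
        evidence="Re-running the published code with corrected coding flips the sign",
        bounty_usd=500.0,
    )

The interesting design question is who gets to move a note from "open" to "invalidated"; the bounty only works if that step is adversarial, like community notes' cross-perspective rating.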
I wish academic papers were more like Wikipedia articles. Currently what I'm working on is really "building on" one pretty pivotal paper from the 90s, and there's a whole constellation of work that has spawned from it.
So much ink is spilled re-defining the problem, and reading any paper requires going through the system model every time, because tons of arbitrary decisions may have been made differently. That makes it hard to compare results, and it makes almost every statement that reads "over in this area we're not innovating on, we used the SOTA" wrong, because some other group is innovating in that corner.
If instead there was one canonical version of it with an edit history, and I could go try to rewrite one little para and argue about it in the talk section with the one or two other groups picking away at that corner, I feel like things could move faster and be done to a higher quality.
It'd also be a lot easier to peek at other areas. Currently if I have a question like "What's the latest in NeRFs underwater? I remember seeing a paper about that a while ago", I've basically got no idea where to start.
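For that kind of question, the closest thing today is scripting the public arXiv API. A rough sketch (the API and its parameters are real; the query string is just my guess at keywords that would surface the right papers):

    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    # Query the public arXiv API for recent papers matching guessed-at keywords.
    params = urllib.parse.urlencode({
        "search_query": 'all:"neural radiance field" AND all:underwater',
        "sortBy": "submittedDate",
        "sortOrder": "descending",
        "max_results": 5,
    })
    with urllib.request.urlopen("http://export.arxiv.org/api/query?" + params) as resp:
        feed = ET.fromstring(resp.read())  # the API returns an Atom feed

    ns = {"atom": "http://www.w3.org/2005/Atom"}
    for entry in feed.findall("atom:entry", ns):
        title = " ".join(entry.find("atom:title", ns).text.split())
        published = entry.find("atom:published", ns).text[:10]
        print(published, title)

Which, of course, only tells you what was published, not which of those papers actually holds up; that's the part the wiki model would fix.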
> I wish academic papers were more like Wikipedia articles.
I don't think that would be helpful. Scientific development happens in branches, not linearly. The fact that a field is going in one direction does not mean that somebody won't make a breakthrough next year based on a poorly-cited paper from the 1970s, leapfrogging a whole bunch of studies that happened in the meantime.
Most of the time, there is simply no "state of the art" that covers a whole field, and even in limited sub-fields, quite often there is no consensus.
Tell that to businesses, and beat it over the heads of marketeers.
I, too, would like to have what you're having.
Academic publishing doesn't make the authors money; it costs them money. To publish is a requirement for academic employment, but there's no incentive to retract, other than revenge or one's own academic honesty.
> Academic publishing doesn't make the authors money
>
> there's no incentive to retract [...] other than [...] one's own academic honesty
We agree on the problems; maybe you have wacky potential solutions of your own, rather than just saying others wouldn't work!
> This entry was posted in [...] Zombies
I feel like I'm missing a joke or reference somewhere - can anyone help me out? I'm familiar with the concept of p-zombies, but that doesn't seem to be relevant here - and a "Zombie Statistic"[0] is sorta-related in that it refers to the psychological power of incorrect assertions, but I'm still struggling to see the link.
[0] https://absolutelymaybe.plos.org/2019/11/30/the-power-of-zom...
The corrections are wonderful. Concise and to the point.
The author is a powerhouse in his field, known by researchers and casual readers. He has nothing to fear from publishing corrections. Nobody is trying to step on him, keep him from promotion, or argue that his research is meaningless.
Other researchers have less privilege, and (imo) therefore will be less forthcoming.
If you only have two papers, admitting one has a fatal flaw instead of giving more talks about it will be career suicide.
Without knowing the specific context: I think this really is a good example of how errors should be disclosed.

We need to acknowledge that scientists/academics are human; even very competent mathematicians make mistakes, and some of these mistakes appear in published papers. What we lack in many fields is a culture and process that allows (and ideally, encourages) one to disclose: "this was wrong, here is how I fixed it, or why it's actually correct". E.g., in the communities I know in Computer Science & AI, I rarely even see errata lists on personal webpages, let alone journals that provide a straightforward process for updates. I would even go so far as to claim that the current culture, in which honest errors cannot be straightforwardly corrected, plays into the hands of the clearly dishonest "bad apples".
Science is, obviously, not a "monotonic" process in which every single paper adds to the truth; in practice this is not even the case for mathematics, which is at least monotonic at the object level (though mistakes happen all the time). As a prominent example, consider this impressive list of Feynman errata: https://www.feynmanlectures.caltech.edu/info/flp_errata.html
> 1999: It is not clear to us in general how to avoid this sort of false proof, the problem being that the false statement seemed so natural to us that we did not think to look at it carefully.
Assuming I understand correctly, this is basically the common issue of being unable to be objective when you’ve lived and breathed a subject matter for long enough. The answer is rigorous peer review, I think.
Peer reviews are (often) also done by people who live and breathe the subject matter.
I really enjoyed this phrasing, from the comments on the post:
> I suspect the careerization of Big Science and Big Academia has a lot to do with [unwillingness to plainly admit error]. Although I have no proof to that effect. And certainly man has been subject to failings since Adam and Eve chomped.
This was a prominent mark of the Bush era too (and I'm sure of many other Democratic/Republican administrations; it's just my example). I was rewatching Fahrenheit 9/11 because of our government's current foreign engagements, and it's striking how even then people were duped. But back then it was by a lack of information; now it is by overinformation.
Karl Rove's "reality-based community"[1] is still with us:
> The aide said that guys like me were 'in what we call the reality-based community,' which he defined as people who 'believe that solutions emerge from your judicious study of discernible reality.' [...] 'That's not the way the world really works anymore,' he continued. 'We're an empire now, and when we act, we create our own reality. And while you're studying that reality—judiciously, as you will—we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors...and you, all of you, will be left to just study what we do'.
1: https://en.wikipedia.org/wiki/Reality-based_community
What was nuts in the Bush era was the bloodthirstiness, both generally and in the media.
There were so few voices against invading Iraq. It was just this vagueness that terrorists were everywhere and somehow Iraq was involved in 9/11 (and getting ready to nuke everything).
I mean, ffs, there was a popular TV show (24) that spent each season detailing how the brave heroes would stop the Muslim terrorists using torture.
Where I lived, it wasn't uncommon to hear people advocate leveling the Middle East with nuclear weapons.
"Everyone else is an extremist, but I'm totally sane and justified."
When you look at "Muslim extremists" around the world, they are inspired by the same kinds of events and influences that "Christian Americans" are. Zealot leaders lie to the people to further their own aims, and the people lap it up because they see "evidence" of oppression from "the outsiders", are indoctrinated to hate "the outsiders", and have their own problems which they're easily led to blame on "the outsiders". A minority of people fall for it, but enough to make a lot of noise, and it's just enough that the state can seize it as a "mandate from the people" to justify its actions, and everyone else just shrugs and lets it happen.
Muslim extremists around the world are far more driven by political and land conflicts than anything else. They are far more similar to the IRA than to any Christian extremists.
ISIS is the one that seems most ideology-driven. But Hamas, Hezbollah, the Taliban, the PLF, the ETLO, the various Kashmiri militants: all of these are largely a result of land conflicts, like the IRA.
Heh, funnily enough, I know I wasn't sane. I bought the "weapons of mass destruction" and "pre-emptive strike" lines and the notion of not letting the terrorists win.
What happened, I think, is much more banal. Everyone was pissed at such a large, successful terrorist attack. Politicians took that as a strong signal that they must do something. And the easiest solutions were new policing agencies, privacy invasion, and invading sovereign nations. Those were easy answers to sell to angry citizens. They were also the wrong answers, which led to hundreds of thousands dead, the region destabilized, new and larger terrorist organizations (ISIS/ISIL), and trillions of dollars wasted.
The right answer just wasn't flashy: locking cockpits, diplomacy with Afghan leaders, and ultimately intelligence gathering and a targeted strike on Osama bin Laden if he was found. We Americans wanted blood; what we needed was to address the holes that made the attack possible and then just move on.
And unfortunately, some countries today are not learning the right lessons from our failures.
Yes, Moore was extremely hated in America for the film, but his closing quote has essentially defined modern America's isolationist movement:
"The people who have the least, who suffer the most, are always the first to stand up to defend that very system. And all they ask us in return is that we never send them in harms way unless it's absolutely necessary. Now, they may never trust us again."
FWIW, I don't think this is really a Trump era exclusive. People have always believed whatever they wanted to believe, facts and accuracy be damned. IMHO understanding this as an inexorable human condition makes it much easier to understand the world.
What actually matters is that people can be persuaded to _want_ to believe different things, so the only real leverage is in shaping those wants, not in being right.
Also, if you can get people to connect intensely with just one or two of your statements, you can then make other false statements and they won’t care, because that might invalidate the one they really want to believe. The threshold is so low that the shotgun approach of just telling lies continually actually works quite well. Your statements don’t even have to be consistent, so you can A/B test the lies.
The normal backpressure to this is that you lose face with your peers because you become known as a liar. But if your peers don’t influence your success, or you just have no peers, it works.
https://news.ycombinator.com/item?id=42075533 ("HN: Misinformation Does Spread Like a Virus, Epidemiology Shows")
https://theconversation.com/misinformation-really-does-sprea... ("The Conversation: Misinformation really does spread like a virus, suggest mathematical models drawn from epidemiology")
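For anyone curious what "spreads like a virus" means mathematically, here's a minimal sketch of the SIR-style compartment model the articles describe, applied to a rumor (susceptible / currently believing / no longer believing). The parameter values are illustrative, not taken from the linked papers:

    # Minimal sketch of an SIR-style rumor model; beta and gamma are illustrative.
    def sir_rumor(beta=0.3, gamma=0.1, days=160, i0=1e-3):
        """Forward-Euler integration of dS/dt = -bSI, dI/dt = bSI - gI, dR/dt = gI."""
        s, i, r = 1.0 - i0, i0, 0.0  # population normalized to 1
        history = []
        for _ in range(days):
            new_believers = beta * s * i  # exposures that stick
            debunked = gamma * i          # people who stop believing
            s, i, r = s - new_believers, i + new_believers - debunked, r + debunked
            history.append((s, i, r))
        return history

    # beta/gamma is R0; here R0 = 3, so the rumor takes off before burning out.
    peak = max(i for _, i, _ in sir_rumor())
    print(f"peak share of the population believing the rumor: {peak:.1%}")

The epidemiology analogy buys you the usual levers: lower beta (friction on sharing) or raise gamma (faster debunking), and the rumor stops being self-sustaining once beta/gamma drops below 1.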
This is a generic ideological tangent.
> Please don't use Hacker News for political or ideological battle. That tramples curiosity
Please try to follow this guideline better in the future.
https://news.ycombinator.com/newsguidelines.html
It sucks how Trump invented dishonesty! What a jerk!
The paper [1] is linked in the OP. Section 3, the section invalidated by the errata, looks at data on voter preferences. So the answer to your question is "a lot", assuming that you think that "using data on the beliefs of voters" is in some way important to a paper on electoral strategy.
[1] http://www.stat.columbia.edu/~gelman/research/published/AOAS...
> "Should the Democrats move to the left?
>
> Because of a data coding error, all of our analysis of social issues is incorrect.
Yeah, sure, that's why the analysis was incorrect: it was all about that typo...
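To be fair, a single coding error genuinely can flip a conclusion on its own. A toy illustration (invented numbers, nothing to do with the actual analysis in the paper): suppose a 1-5 liberal-to-conservative scale was accidentally read in reversed for one survey wave.

    import statistics

    # Toy data, invented for illustration: 1 = very liberal ... 5 = very conservative.
    wave_a = [2, 3, 2, 1, 3, 2]
    wave_b_raw = [4, 5, 4, 3, 5, 4]       # stored reversed by a hypothetical pipeline
    wave_b = [6 - x for x in wave_b_raw]  # the fix: re-reverse the scale

    print(statistics.mean(wave_a + wave_b_raw))  # ~3.17: sample looks center-right
    print(statistics.mean(wave_a + wave_b))      # 2.0: corrected, it looks center-left

Same data, same model, opposite headline. Whether that excuses the original conclusions is a separate question.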
The cover up is usually worse than the crime.
Do you think these errata being published are a good thing?
Yes of course. That's the point of my comment, which is being downvoted for some reason that eludes me.
Perhaps it's because your comment seemed to equate errors with crimes, or at least with malicious intent. The language seems a bit provocative for many, detracting from whatever message was intended.
Sometimes I write with "downvotes be damned" in mind and have connected with the audience exactly how I intended.
But not connecting with the audience is the usual reason.
When it happens to me, I take it as feedback on my writing. Maybe I was unclear. Maybe I was wrong. Maybe it was written for a different audience.
In those cases, I just try to improve my writing.
Anyway, where can responses to your original comment go?
They could dispute your maxim and the internet gets another argument where nobody changes their mind.
Or they could agree with it and the internet gets another dog pile of cynicisms.
Generally, those are not why people come to HN...at least when the form is one-liners.
Finally, complaining about downvotes is contrary to the HN guidelines. Good luck.
Excellent, helpful, and appropriate to the context BTW.