Changing some details for obvious reasons: a friend was a postdoc in biology at a top-tier UK university. Her boss was a hotshot professor with tons of revolutionary publications under his belt.
Her first job when he hired her was to hammer into shape a paper one of his PhD students was writing. Normal stuff, right? Adding literature, polishing off rough edges, etc. Except the PhD student seemed really uneasy about the paper and kept talking about issues with the data that needed fixing. At face value, all was great - nice small p-values, great effect sizes, etc. Eventually it transpired that the PhD student was doctoring the data (these were the "issues"), was deeply uncomfortable with it, but also understood that this was what my friend would be helping her with.
At this point my friend goes to the professor to inform him, nicely but firmly, of academic fraud. The prof comes down on my friend like a ton of bricks: who do you think you are, what do you know, I am the hotshot superstar and you are a no-name. All is well with the research, and if she presses on with these ridiculous accusations, she will be ruined in academia.
Well, my friend is a feisty one. She complained of bullying to the university, as an employment thing. The university was desperate not to touch it and in fact encouraged her to lie low, but she wouldn't. The whole thing kept escalating in really nasty ways, but eventually wound its way to an employment tribunal. Notably, at this point this isn't even about the academic misconduct, it's about the professor bullying his employee.
At the eleventh hour, the prof quits his job and goes to another, also top-tier, university. Since he is no longer my friend's boss, the case is dropped - no boss, no case. Still, however, no academic misconduct case! My friend, against all advice, wrote a letter to the new university, informing them in full detail of her findings. No response, to this day. The guy is still a top honcho there.
My friend, meanwhile, finished her postdoc, didn't get another academic position, and got an industry job, where, frankly, she seems happier.
This sort of thing applies elsewhere as well. When there's a prestige or financial incentive, one can never be sure of the veracity. See literally all mainstream media.
Everything is underwritten - the writing, if you zoom in, says "our advertisers approve this segment"
So how then, can one know that they're getting actual facts or good reporting?
The problem is that people (correctly) identify problems in mainstream media, then immediately imprint on a youtuber, talk radio host, or literal enemy propagandist who they decide has the actual truth.
I can't help but feel a similarity to the entertainment industry dynamics that sparked the MeToo movement.
The issues are different of course, but there are common patterns of prestigious abusers, victims risking ostracism, etc.
I wonder if we'll end up seeing a similar movement in academia, where we spend a few months watching big names taken down from their pedestals every few days.
Academia also has, or has had, a metoo movement over actual sexual harassment and bullying. And of course the two are linked; if you can bully a young female researcher into not complaining about sexual harassment, you can bully her into complicity with your fraud.
Attempts to address this are probably going to get swept away by the anti-DEI crackdown in the US. After all, not harassing your juniors is inclusivity.
I think that's a stretch. You think pro-DEI types are less likely to bully or harass? 'Cause I've personally seen the exact opposite (people accusing others of racism as a means of retribution or to advance their status, people in charge of DEI programs bullying or politicising their role, etc.)
I don't know if people who will readily accuse others of racism as a tool of retribution can be considered genuinely serious about their DEI endorsement.
Now, with the winds changing direction under Trump administration 2.0, the same people are probably already denouncing their past and jumping onto the "DEI is bad" bandwagon.
It's a flimsy argument for sure. But we live in a reality where Meta, Google, and other past bastions of progressive movements took a 180-degree turn in Trump's first week of reign. Is it that much of a stretch to assume that a large swath of the DEI champions around us were just parroting performative mantras?
Maybe hypocrites are pretty obvious to spot for you. It's not as easy for me. I was shocked to find many of my peers turn their noses up at DEI policies and act as if they'd always been against them.
Bandwagoners are going to bandwagon. I personally don't see much use in labelling them as anything else.
> I was shocked to find many of my peers turn their noses at DEI policies and act as if they've always been against it.
Why were you shocked? Are you new to the workforce? Did you never live through a time when criticising a company policy[1] was a serious CareerLimitingMove(tm)?
Throw in the fact that people were both routinely and publicly punished for not jumping on the DEI bandwagon, and you have a large number of people who would rather keep their head down and get their work done in the most agreeable manner rather than tell you what they think.
People aren't going to work to have their opinions validated, they're going to work to get money. They aren't going to jeopardise that money by going against the company vision (whatever that may be).
[1] DEI or otherwise. Don't worry though, when the pendulum swings back these people would then present as supportive of DEI.
Take this comment as an extra upvote. The thing is: we need to figure out a way to make accountability more of a thing. Peer review isn't enough (clearly). I suspect people should create a journal with stricter requirements than plain peer review. I think for many forms of science, one should provide an R script along with the data used to get the results. Yes, there are still ways to fabricate stuff, but it'll be more transparent. Another thing: there needs to be an incentive for reproducible findings.
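To make the "script plus data" idea concrete, here's a minimal sketch (in Python rather than R, with a made-up `data.csv` and column names) of what a "recompute the headline statistic from the raw data" script might look like:

```python
# Hypothetical reproducibility script: recompute a paper's headline
# statistic directly from the raw data shipped alongside it, so any
# reviewer can rerun the analysis. File name and columns are invented.
import csv
import statistics

def load_column(path, column):
    """Read one numeric column from a CSV of raw data."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / (va / len(a) + vb / len(b)) ** 0.5

if __name__ == "__main__":
    treatment = load_column("data.csv", "treatment")
    control = load_column("data.csv", "control")
    print("Welch t =", welch_t(treatment, control))
```

The point isn't the particular test; it's that anyone with the data file can rerun the exact computation behind the reported numbers, so a fabricated result has to survive contact with the raw data.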
I think it’s better to provide raw data and your hypotheses (in some manner) to third-party reviewers in a double blind fashion. If third-party reviewers cannot confirm your hypotheses, then it’s better to go back to the drawing board. Sometimes people are not necessarily trying to defraud, but maybe they’re simply misinterpreting or misunderstanding their results.
The people providing science funding ought to be interested in their money being spent on science. If the methods of a study are not described in sufficient detail as to enable replication, the study is simply not science. Of course, doing science requires more diligence than doing pseudoscience. That's kind of the point.
Honestly? Peer review sucks, and I'm not convinced it's all fine and good either.
Public review is better (but also worse in important ways)
A possible improvement would be for reviews to be semi-public but author-anonymous, and for reviewers to have some choice in selecting papers.
Is she in a position to lose time, emotional stability, etc.? Such a pain in the ass, and for what? She's already been abused and trampled by the system. Give her a break for not jumping at the chance to say "more, please".
> The incentives to investigate and call out fraud are non-existent. In fact, the opposite is true: If you find something fishy in a paper, your mentor, colleagues, and friends will most likely suggest that you keep quiet and move on (or as I have learned the hard way, they might even try to bully you into silence).
I can’t help but wonder if these mentors are also guilty of fabricating data and don’t want to cause issues that might lead to people looking at their own work more closely.
> I can’t help but wonder if these mentors are also guilty of fabricating data and don’t want to cause issues that might lead to people looking at their own work more closely.
It's far more likely that they know whistleblowing creates far more pain than many people expect.
People who build their reputations on top of lies tend to lash out when threatened. They know they can't afford to have their lies revealed, so they launch an all-out offensive to discredit, bully, and push out their accusers.
I've seen it play out in the workplace a couple times now. Some people get downright vicious when they feel their reputation is at risk. I watched someone spend years trying to get another person fired simply because that person accidentally uncovered one of his lies. The person who discovered it wasn't even trying to do anything with the information, but the person who lied about it was so anxious about the fact that somebody knew that he started doing increasingly irrational and desperate things to push the other person out.
Looking too closely has zero upside and plenty of downside.
If you’re wrong and the result was legitimate, you’ve developed a reputation as a pain in the ass.
If you’re right, the researcher will say it was an honest mistake (maybe a clerical error) to absolve themselves, modulo a retraction, which happens all the time. Nothing of significance comes of it, and the “influencer” may well have had other, above-board programs of research on which you could have collaborated and made a real name for yourself.
I'm a researcher and I'm heavily dependent on inputs from my students. If one of them is falsifying data, that's my entire career gone. I know the usual line is "no big deal, another University will just hire the PI" but I don't believe that for a second. Even if it somehow did turn out to be true, I'm going to uproot my family and go through hell because of something I could now avoid?
There are crazy people in the world. But if someone comes to me and says "here is a thing that could end, or at least seriously screw up your career" I'm going to take them very, very seriously. Even if it is incredibly horrifying to contemplate.
And there's the awkward fact that if said influencer is engaged in fraud, they have more free time, incentive, and natural inclination to politic. They don't have the handicap of spending time worrying about solid research, and they clearly care a lot about managing what other people think of them.
There is no way that's true in my research group. We would - and do - call each other out on our mistakes at every group meeting. Straight-up fraud would not be tolerated.
I think people are underestimating the psychological impact of failing to act on fraud in your research. Once you know that you're a fraud and that what you're doing is bullshit, it would be soul-crushing to go to work every day.
I have a feeling that non-academics on the Internet have an overly bleak view of the ethics in acamedia. I don't know why, but several online influencers really like to peddle the idea that we're all corrupt.
Upvoted for the optimism. And I presume it's just a typo, but "acamedia" is wonderfully evocative for cases where academic papers are to be written primarily to provide a foundation for future popularization and commercial use.
Hm, it looks like the Greek etymology is disputed (it was named after a place rather than with purpose).
ake = silent
heka ... in borrowing normally means "one hundred", but in Greek it's actually "he" (one) + "katon" (hundred), so "heka" doesn't mean anything. (eka- in chemistry is Sanskrit and unrelated)
demos = district
medius = middle in Latin; the Greek is mesos, which seems to maintain more of the explicit "between/among" sense (for which Latin has "inter")
That's rare. I also worked in a lab that had a high degree of internal discourse freedom and a low tolerance for fraud. It still took a lot of fortitude for one of the grad students to force a retraction of 4-5 of the lab's papers (but it did happen). And many of my colleagues lost years of their lives chasing other labs' misconduct. So either it's a widespread problem, or even the people in the good labs are too dimwitted to notice when things are sheisty.
Damn, you started off so hard-core and no-nonsense, like SEAL Team 6 had entered the research review arena, and then your finishing blow to fraud was... it's soul-crushing to the fraudster?
No, to the team. Science is collaborative. If you learn that the results you are working with and building on are fraudulent, what's even the point? Trying to replicate a team member's results with it never working, starting new projects designed around fake data, continuing to tolerate and endorse the fraud by keeping quiet - the entire group's work becomes meaningless. Fraud infects all the research that references it and builds on it. Would you want to spend 5 years of your life futilely trying to build on your collaborators' work if you knew at the start that their data was fake?
> There is no way that's true in my research group.
Well, congratulations to you, but there's a real crisis of legitimacy in some fields. Especially once they've been filtered through the pop-sci press-release filter, which only makes things less accurate and the claims more extravagant.
Social science and psychology have big reproduction crises and seemingly no recent usable results at all. Nutritional science seems to be compromised by the food industry. Economics blatantly has different "schools" which give different answers depending on politics. "Business science" seems barely worthy of the name of science at all.
The risk is that mistrust from those spills over into scientifically settled but politically controversial areas.
Your career is done. Academics tend to rule themselves: think peer review, but for everything. You apply for a grant, and you're on bad terms with the reviewers because of some criticism? You get nothing. The editors of the journals are also scientists. Are you on bad terms with a friend of the editor? "Nice work, but it doesn't fit our journal" - and also, let me send some emails to make sure you don't get published in the first decile, or your paper gets slowed down for months.
You deserve the Nobel prize, but you criticized someone's work? You get nothing! The Swedish chemist Arrhenius blocked the Nobel prize for Mendeleev because the latter had made some valid scientific criticisms the former didn't like.
Perhaps relevant to a lot of HN readers - the culture of conference publishing in much of CS (I'm in computer systems myself) seems to result in significantly different dynamics.
Journal papers are single-blind, i.e. the reviewers know the names of the authors and can make decisions based on reputation, so if you publish great (e.g. faked) results once, it becomes progressively easier to do it again.
Reviewing for conference papers is typically double-blind - we don't know the identity of the authors, and although we can try to guess, a couple of studies show we're wrong most of the time. Most papers (65-85% at good conferences) get rejected, so it's easy to flush something that looks fishy. (then again, it's hard to tell the difference between someone playing games with their numbers and someone who just has the typical crappy CS systems understanding of statistics) The addition of artifact evaluation to a lot of conferences provides another avenue to flush out faked results, too.
However conference reviewing is vulnerable to reviewing fraud, since reviewers and authors typically come from the same relatively small community - see https://news.ycombinator.com/item?id=23459978, or just google for the words "computer science reviewing cabal".
Reviewing fraud might not be quite as destructive in the long term, as the papers that get published through such fraud are probably more crappy than faked, and will hopefully not get cited. In addition, when reviewing fraud is uncovered it's much more of an open-and-shut case - a single person can fake data and make it look a lot like good data, but review fraud requires communication between people who are explicitly barred from communicating on that topic.
A lot of it depends on the community, and the size of the community. When you have 1000+ reviewers for a conference things will always get out of hand - I know of someone who got his cat registered as a reviewer for a particular SIG conference that he hated, just to make a point. I assume one could get a lot of fraudulent research published in that particular conference.
Sometimes, even a whole conference gets compromised, see e.g. ICIMTECH '21 [1]:
"NOTICE OF RETRACTION: While investigating potential publication-related misconduct in connection with the ICIMTech 2021 Conference Proceedings, serious concerns were raised that cast doubt on the integrity of the peer-review process and all papers published in the Proceedings of this Conference. The integrity of the entire Conference has been called into question. As a result, of its investigation, ACM has decided to retract the Entire Conference Proceedings and all related papers from the ACM Digital Library.
None of the papers from this Proceeding should be cited in the literature because of the questionable integrity of the peer review process for this Conference."
Oh my, the titles in those proceedings are hilarious.
> Retracted on February 24, 2022: Innovation Research on College Student Affairs Management Based on Big Data Environment
> Retracted on February 24, 2022: University Students in the Greater Bay Area Conduct Research on the Construction of Big Data Teams for Cross-border E-commerce Startups in the Middle East
> Retracted on February 24, 2022: Discussion on International Trade Governance under the Background of Big Data
> Retracted on February 24, 2022: Application of Incoterms in International Engineering Based on Information Platform
> Retracted on February 24, 2022: The Fixed-point Tracking Model of Volunteer Service for College Students of Traditional Chinese Medicine Based on Information Technology
> Retracted on February 24, 2022: Sichuan Consumers' Mining of Agricultural Product Brand Value Based on Big Data
> Retracted on February 24, 2022: Sports Consumer Behavior Based on Integrated Data in the Context of Big Data
> Retracted on February 24, 2022: Talent Heights Construction from the Perspective of Talent Concentration
I agree about CS reviewing for conferences (esp. in systems) - pretty reasonable for the most part. Also, if things just don't work, they don't work - which is a bit different than studies in social science.
A friend of mine was working in cryptography. His PhD topic was proving that a certain cryptographic scheme was secure (for some definition of secure). It took him two months to break the scheme. Sadly, said scheme was key to the funding of his PhD advisor, so the advisor tried to prevent him from publishing his findings.
Things ended mostly well, with my friend publishing, then restarting his PhD with a different advisor, then leaving academia entirely.
I have always wondered whether the threshold for 'number of papers published' to get a professorship is higher than the number of high-quality papers that can reasonably be published in that amount of time. So people cut corners and publish bad papers or commit outright fraud. And because they do that, the required number of publications stays unreasonably high.
>Peter Higgs, the British physicist who gave his name to the Higgs boson, believes no university would employ him in today's academic system because he would not be considered "productive" enough.
>The emeritus professor at Edinburgh University, who says he has never sent an email, browsed the internet or even made a mobile phone call, published fewer than 10 papers after his groundbreaking work, which identified the mechanism by which subatomic material acquires mass, was published in 1964.
>He doubts a similar breakthrough could be achieved in today's academic culture, because of the expectations on academics to collaborate and keep churning out papers. He said: "It's difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964."
>Speaking to the Guardian en route to Stockholm to receive the 2013 Nobel prize for science, Higgs, 84, said he would almost certainly have been sacked had he not been nominated for the Nobel in 1980.
Yes. The bureaucratic systems put into place to increase "quality" and "output" are incoherent, so they create even more incentives for fraud. It's paper Stakhanovism.
Aviation once had a culture of strict deference to senior pilots, leading to fatal mistakes when juniors hesitated to question decisions. This changed with Crew Resource Management (CRM), which institutionalized open communication and made questioning a normal part of operations.
Perhaps academia could benefit from a similar culture shift that actively invites scrutiny to improve science overall? The article implies a deference and bullying problem in research, where junior scholars hesitate to challenge authority, even when they notice flaws. Fraud wouldn't be the direct focus of such a shift, but in an environment where questioning is encouraged, it would naturally be harder to commit, with fewer places for bad actors to hide.
> Instead, they repeatedly suggested that a scientific criticism of a published paper had no place in a dissertation
Why not a) finish the dissertation as soon as humanly possible, and b) now that you're bona fide, publish a separate paper disputing the famous paper which your dissertation didn't rely on?
Instead, it sounds like the student and committee advisors were a bad fit and spent most the time in a pissing contest.
That's not really possible. If you're writing a dissertation and find that some of the important foundational work you're building upon is wrong, you can't just ignore it. If it's wrong, you need to say it's wrong in order to justify why you've chosen not to consider it in your dissertation. If you can't claim it's wrong, then you have to use it in your dissertation, or else you leave yourself open to the criticism that your dissertation ignores important prior work in the field. So the only choice you have left is to write your dissertation built upon work that you know is trash, and then if you do later choose to publish criticism of that work, you're essentially trashing your own dissertation. And of course, none of that matters, because you're still going to face professional consequences when you choose to write the separate paper disputing the famous paper.
> Why not a) finish the dissertation as soon as humanly possible, and b) now that you're bona fide, publish a separate paper disputing the famous paper which your dissertation didn't rely on?
The last thing you want to do before pointing out that the emperor has no clothes is get a job at the palace.
(Re)read their story[1] - they were fighting to publish the refutation as the first chapter of their dissertation. They wanted to publish no matter what.
Given that, choose your sword:
1. Grad Sword that must be approved by a dissertation committee
2. Master (well, PhD) Sword where you choose your own adventure
If you reply #1 then I'm logged in to a fake HN site.
As ever I think the bigger problem is grey areas, fooling oneself, etc rather than outright fraud. If you really are self-interested and greedy, you’ll probably choose another career than academia.
Woit does do a bit of grey area coverage. Likewise, his academic pubs are more interesting to me, personally! His reputation, however, is mixed at best. A real hero!
Gelman: maybe he's trying to subdue the neuroticism*. I look forward to more compassionate, less overtly "curiosity"-driven grey-area coverage (not to mention stats papers) from him!
Here is a recent one, sadly also about psychologists (it would be more interesting to do it to card-carrying statisticians).
*Not many people describe themselves as an exponent of radical self-honesty since radicality isn't a palatable, much less enjoyable, thing. It takes a real saint (or logician) not to reflexively flinch from the imagined tsunami of incoming pity.
Sorry, was trying to be generous with "mixed rep at best". He is reasonably active (on arxiv,say, which is a red flag I suppose). Just that his blog gets far more attention. I would say that anybody in theory more active than Woit would immediately arouse my suspicion :)
Eh, very few fraudsters start with a plan to commit fraud. More often than not, it starts with legitimate intent, but then thoughts happen along the lines of "I could make my ten years of work much more effective by being just a tiny little bit dishonest!" And then it becomes, "but now I have to lie about this other slightly bigger thing too, or else it will look suspicious!" And eventually, "now that I've already lied so much, I might as well just make stuff up."
Particularly in research, where you may be working on the same idea for decades, it can be damning when you realize that the idea might just be fruitless. If all you had to do was remove one or two zeroes, and finally your colleagues would praise instead of mock you, wouldn't that be great?
Example on a smaller scale - I remember how at uni people who did none of the work could end up with better grades. I remember toiling for weeks on a hard "group" project alone, and then my groupmates who contributed nothing got better grades - because on the project we all got the same grade, and on the exams I got worse ones since I had no time left to study properly.
There is always a push to do original research. Maybe a partial solution is to encourage, or even require, everyone to take up a major and a minor specialty: in the major you do original research, and in the minor you work to reproduce existing research. Building that in from the start would go a long way toward creating a culture of trust-but-verify in these fields.
Not sure how it is in all fields but at least in the molecular bio world there's a lot more replication than meets the eye. Especially methods, it's just that there won't be a replication paper, it'll be in the methods/supplementary info of the new research paper ("we used xyz techniques as described in researcher et al").
Publication bias against null results still exists, and obviously it's not remotely a perfect system. However, it's not some completely broken system where every paper is either fraud or replication, and interventions that start from that sort of assumption are, in my opinion, unlikely to be useful (people are already very aware of the need to trust but verify).
If the world is at a 60/100, you need to focus on things that get you from 60 to 80, not from 0 to 20
Seems like wishful thinking, much like saying engineers at Amazon can do whatever they want despite what Bezos says. Yes, they can, but they'd get fired. And yes, incentives could be changed, but they won't, because the person running the organization doesn't want to.
Research funding and research institutions are existing things, that operate in specific ways. One can lobby to change this, but unless you want to go back to the "gentleman scientist" model of the 1700s and 1800s, and a scientific establishment about 1% the size of the one today, you can't just buck the system.
I'm so glad that Zoe Ziani essentially has the last word on this subject.
For what it's worth, while I'm sure that her advisors are known within that research community, I believe they deserve to be named and shamed. Their behavior, IMO, is almost as bad as the original fraud to begin with. It was classic "circling the wagons" bullying, and their scientific reputations absolutely deserve the hit that their atrocious behavior warrants.
I want to emphasize that this isn't about "retribution" or getting back at these people, it's about publicizing that their behavior was wrong, the antithesis of what scientific inquiry is about, and was fundamentally rooted in a cowardice that put professional standing on a higher pedestal than scientific integrity. Actions have consequences, and these advisors should face consequences for their abhorrent actions.
Fraud is basically the only career path in our current fake economy. Everyone is committing fraud on some form or another. Your company, you personally, your government, etc.
As sensationalist as this sounds I agree. I think a major turning point was when tech became a big business by charging nothing.
1. It’s impossible to compete with free, so businesses that do add real value are at a disadvantage.
2. A future where tech adoption happened slowly and people eased into being comfortable actually paying for the value never arrived, and it now feels impossible, as free is expected.
3. It’s created bad incentives. Like keeping people on their devices as much as possible and only show them things that enrage them or confirm their prejudices.
4. None of it is really free, because someone pays for it all, and those costs are borne by society anyway, while we have less choice.
5. I think over time this ethos has become mainstream and it’s essentially a fraud mindset. Nothing matters as long as the line goes up.
(This process and the end results are surprisingly similar to academic fraud. “Everything is free” is the same thing as inventing whatever results people want to hear. Both will ultimately lead to the same, unsustainable end result, as people will no longer trust research because everything is fake.)
Even though I have very high moral standards and refuse to work at shady places, almost every place I have ever worked, and places people I know have worked, have committed some kind of fraud. I don’t know if they all met the legal definition of fraud, but they are all fraud in my book.
I’ve seen fudging analytics and subscriber numbers to lie to ad buyers. I’ve seen people intentionally hold items over from one quarter to the next for accounting purposes. I’ve seen events use dubious counting methods to inflate their attendance figures. I’ve even heard stories about hospitals moving patients from failed surgeries back to their floor before they die in order to fudge the surgery survival stats.
A lot of this is Goodhart’s law in action. But also, when these very tiny frauds go unchecked in a competitive marketplace, everyone becomes forced to do them. If law enforcement won’t punish them for being evil, the market will punish them for being good.
Bingo. This is why in the latest writers guild / SAG strike, the studios would not budge on transparency for streaming residuals. They've lied too much to let the true numbers of streams for their content be seen.
It's ironic that one of Gino's partners in crime, Dan Ariely, has been writing bestselling pop-sci books about dishonesty, based on his own faked data.
Maybe for the top 1 or 10% folks out there who are good enough at fraud to actually reliably get away with it. For the rest of us, I don't think this is true.
That’s the saddest part: you’d think that, but the fraud here (and in other similar cases) is so blatant it’s painful.
Modern academia has terrible incentives. The easy path to success comes from not rocking the boat, even the guys at Data colada (who did the leg work to prove the case) have made a lot of enemies in the field.
I’m willing to bet there’s a large amount of less significant fraud that gets better defended; it’s only the people who are so successful they let their guard down who get caught in this silliness.
Maybe we should modernize the publishing system for the information age, so that papers are no longer published as a permanent, fixed event, but rather tentatively put up online, from where they can be taken down at any point by third parties if they can't be replicated, without any say from the original "publisher".
Relying on the potentially fraudulent originator who put the thing up to take it down is apparently too much to ask.
There’s a lot that sucks about publishing in security and privacy. But wow this article makes me super grateful that we are still allowed to write attack papers and critique famous works.
Science still works if the newly found knowledge can somehow be transformed into technology. Technology is perhaps the strongest form of reproducibility there is. This is true even for social science, as it will make its way into novel marketing approaches that clearly generate more revenue.
I'm not sure "pre-scientific" is a meaningful distinction. Maybe "pre-scientific-method"?
Certainly one major problem is: although we live after the scientific method was developed, things can still be done without actually following its principles.
The Scientific method exists because of fraud, confusion, and other issues.
Like any set of tools, it has to be actually applied. There is a long and storied history of scientific advances and discoveries being fraudulent, because there are many incentives for fraud - or just plain delusion. "Delulu is the solulu" has always been tempting.
There is nothing missing from the toolbox here. It just has to be applied.
It's not even that bad of a system. The Roman Republic conquered the civilized world with a political system based on formalized clientelism.
And anyone who's seen "honest" public procurement will see the benefit to a system where the governor can call his friend and have a railway built by the end of the year.
I don't think this is incorrect but it's definitely a grey area to spell it this way. It would be ironic to be downvoted for pointing it out based on the intent of the article.
I always understood and used that idiom a little differently. I think of it as more about appearances vs. internals, e.g. imposter syndrome and the time it takes to develop skills fully. Nobody knows what they are doing on their first day at a new job, but they have the title already.
This article is ironic given the expulsion of many students who expressed support for Palestine. The article goes into why a student's dissertation needed another defense meanwhile they are arresting and banning students en masse.
Changing some details for obvious reasons: a friend was a postdoc in biology at a top tier UK university. Her boss was a hotshot professor, tons of revolutionary publications under his belt.
Her first job when he hired her was to hammer into shape a paper one of his PhD students was writing. Normal stuff, right? Adding literature, polishing off rough edges, etc. Except the PhD student seemed really uneasy about the paper, and kept talking about issues with the data that needed fixing. At face value, all was great - nice small p-values, great effect sizes, etc. Eventually it transpired the PhD student was doctoring the data (these were the issues), was deeply uncomfortable with it, but also understood that this was what my friend would be helping her with.
At this point my friend goes to the professor to inform him, nicely but firmly, of academic fraud. The prof comes down on my friend like a ton of bricks: who does she think she is, what does she know, he is the hotshot superstar and she is a no-name. All is well with the research, and if she presses on with the ridiculous accusations, she will be ruined in academia.
Well, my friend is a feisty one. She complained of bullying to the university, as an employment thing. The university was desperate not to touch it and in fact encouraged her to lie low, but she wouldn't. The whole thing kept escalating in really nasty ways, but eventually wound its way to an employment tribunal. Notably, at this point this isn't even about the academic misconduct, it's about the professor bullying his employee.
At the eleventh hour, the prof quits his job and moves to another, also top tier, university. Since he is no longer my friend's boss, the case is dropped - no boss, no case. And still no academic misconduct case! My friend, against all advice, wrote a letter to the new university, informing them in detail of her findings. No response, to this day. The guy is still a top honcho there.
Whereas my friend finished her postdoc, didn't get another academic position and got an industry job, where frankly, she seems happier.
The very last line of your story is why the “Market for Lemons” paper (https://en.m.wikipedia.org/wiki/The_Market_for_Lemons) remains so powerful.
If fraudsters cannot be found or punished, it is logical to assume that everyone in the field is a fraud.
This sort of thing can be applied to other things as well. When there's a prestige or financial incentive, one can never be sure of the veracity. See literally all mainstream media.
Everything is underwritten - the writing, if you zoom in, says "our advertisers approve this segment"
So how then, can one know that they're getting actual facts or good reporting?
What is non-mainstream?
The problem is that people (correctly) identify problems in mainstream media, then immediately imprint on a youtuber, talk radio host, or literal enemy propagandist who they decide has the actual truth.
Non-mainstream are the guys who are accidentally being directly paid by Russia...
Find the blockheads. (From Samuel Johnson's "no man but a blockhead ever wrote, except for money".)
I can't help but feel a similarity to the entertainment industry dynamics that sparked the #MeToo movement.
The issues are different of course, but there are common patterns of prestigious abusers, victims risking ostracism, etc.
I wonder if we'll end up seeing a similar movement in academia, where we spend a few months watching big names taken down from their pedestals every few days.
Academia also has, or has had, a metoo movement over actual sexual harassment and bullying. And of course the two are linked; if you can bully a young female researcher into not complaining about sexual harassment, you can bully her into complicity with your fraud.
Attempts to address this are probably going to get swept away by the anti-DEI crackdown in the US. After all, not harassing your juniors is inclusivity.
I think that's a stretch. You think pro-DEI types are less likely to bully or harass? 'Cause I've personally seen the exact opposite (people accusing others of racism as a means of retribution or to advance their status, people in charge of DEI programs bullying or politicising their role, etc.)
I don't know if people who will readily accuse others of racism as a tool of retribution can be considered genuinely serious about their DEI endorsement.
Now with the winds changing directions under Trump administration 2.0, the same people are probably already denouncing their past and jumping onto the "DEI is bad" bandwagon.
no true scotsmen, eh
It's a flimsy argument for sure. But we live in a reality where Meta, Google and other past bastion of progressive movements have taken a 180 turn upon Trump's first week of reign. Is it that much of a stretch to assume that a large swath of DEI champions around us were just parroting performative mantras?
Maybe hypocrites are pretty obvious to spot for you. It's not as easy for me. I was shocked to find many of my peers turn their noses at DEI policies and act as if they've always been against it.
Bandwagoners are going to bandwagon. I personally don't see much use in labelling them as anything else.
> I was shocked to find many of my peers turn their noses at DEI policies and act as if they've always been against it.
Why were you shocked? Are you new to the workforce? Did you never live through a time when criticising a company policy[1] was a serious CareerLimitingMove(tm)?
Throw in the fact that people were both routinely and publicly punished for not jumping on the DEI bandwagon, and you have a large number of people who would rather keep their head down and get their work done in the most agreeable manner rather than tell you what they think.
People aren't going to work to have their opinions validated, they're going to work to get money. They aren't going to jeopardise that money by going against the company vision (whatever that may be).
[1] DEI or otherwise. Don't worry though, when the pendulum swings back these people would then present as supportive of DEI.
Take this comment as an extra upvote. The thing is: we need to figure out a way to make accountability more of a thing. Peer review isn't enough (clearly). I suspect people should create a journal with stricter requirements than simple peer review. I think for many forms of science, one should provide an R script with the data used to get the results. Yes, there are still ways to fabricate stuff, but it'll be more transparent. Another thing: there needs to be an incentive for reproducible findings.
Unfortunately, I don't know how to go about this
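To make the "script plus data" idea concrete, here is a minimal sketch (in Python rather than R, and with entirely made-up numbers and function names) of what could ship alongside a paper: a self-contained script that recomputes the reported effect size and test statistic directly from the raw data, so any third party can rerun it and compare against the published figures.

```python
import statistics

# Hypothetical raw measurements; in a real submission these would be
# loaded from the archived data file that accompanies the paper.
control = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2]
treatment = [4.9, 5.1, 4.7, 5.3, 4.8, 5.0]

def cohens_d(a, b):
    """Effect size: difference of means over the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(b) - statistics.mean(a)) / pooled_var ** 0.5

def welch_t(a, b):
    """Welch's t statistic (a p-value would additionally need the t CDF)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(b) - statistics.mean(a)) / se

print(f"Cohen's d: {cohens_d(control, treatment):.2f}")
print(f"Welch t:   {welch_t(control, treatment):.2f}")
```

The point isn't the particular statistics; it's that the published numbers become a deterministic function of the archived data, so a reviewer who gets a different output knows immediately that something doesn't add up.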
I think it’s better to provide raw data and your hypotheses (in some manner) to third-party reviewers in a double blind fashion. If third-party reviewers cannot confirm your hypotheses, then it’s better to go back to the drawing board. Sometimes people are not necessarily trying to defraud, but maybe they’re simply misinterpreting or misunderstanding their results.
The issue is, who is going to prefer that paper over the existing ones?
You’ll have higher costs, and will you have higher returns?
The people providing science funding ought to be interested in their money being spent on science. If the methods of a study are not described in sufficient detail as to enable replication, the study is simply not science. Of course, doing science requires more diligence than doing pseudoscience. That's kind of the point.
And I think that is really the crux. I don’t see it getting better for a while either.
We need to get a lot more cynicism in society to counteract the fraud.
How many of the people providing the funding are interested in science, and how many simply want an outcome which furthers their ends?
Honestly? Peer review sucks, and I'm not convinced it's all fine and good with it either
Public review is better (but also worse in important ways)
A possible improvement would be for reviews to be semi-public but (author) anonymous, also where the reviewers would have some choice in selecting papers
Maybe your friend could reach out to people like Elisabeth Bik, who blogs here:
https://scienceintegritydigest.com/
Alternatively, maybe the people behind the Data Colada blog can help:
https://datacolada.org/
If she has nothing to lose now, why not release the story publicly?
She's in a position to lose time, emotional stability, etc. Such a pain in the ass, for what? She's already been abused and trampled by the system. Give her a break for not jumping at the chance to say "more, please"
> The incentives to investigate and call out fraud are non-existent. In fact, the opposite is true: If you find something fishy in a paper, your mentor, colleagues, and friends will most likely suggest that you keep quiet and move on (or as I have learned the hard way, they might even try to bully you into silence).
I can’t help but wonder if these mentors are also guilty of fabricating data and don’t want to cause issues that might lead to people looking at their own work more closely.
> I can’t help but wonder if these mentors are also guilty of fabricating data and don’t want to cause issues that might lead to people looking at their own work more closely.
It's far more likely that they know whistleblowing creates far more pain than many people expect.
People who build their reputations on top of lies tend to lash out when threatened. They know they can't afford to have their lies revealed, so they launch an all-out offensive to discredit, bully, and push out their accusers.
I've seen it play out in the workplace a couple times now. Some people get downright vicious when they feel their reputation is at risk. I watched someone spend years trying to get another person fired simply because that person accidentally uncovered one of his lies. The person who discovered it wasn't even trying to do anything with the information, but the person who lied about it was so anxious about the fact that somebody knew that he started doing increasingly irrational and desperate things to push the other person out.
Prisoner's dilemma. The one person who goes out on a limb gets punished, unless everyone around them does it too.
Interpersonal connections can matter more than results for one’s career. Spotting an influencer’s error risks closing important doors.
Looking too closely has zero upside and plenty of downside.
If you’re wrong and the result was legitimate, you’ve developed a reputation as a pain in the ass.
If you’re right, the researcher will say it was an honest mistake (maybe a clerical error) to absolve themselves modulo retraction, which happens all the time. Nothing of significance comes of it, but an “influencer” may well have other, above board, programs of research on which you could have collaborated and made a real name for yourselves.
I'm a researcher and I'm heavily dependent on inputs from my students. If one of them is falsifying data, that's my entire career gone. I know the usual line is "no big deal, another University will just hire the PI" but I don't believe that for a second. Even if it somehow did turn out to be true, I'm going to uproot my family and go through hell because of something I could now avoid?
There are crazy people in the world. But if someone comes to me and says "here is a thing that could end, or at least seriously screw up your career" I'm going to take them very, very seriously. Even if it is incredibly horrifying to contemplate.
And there's the awkward fact that if said influencer is engaged in fraud, they have more free time, more incentive, and a natural inclination to politic. They don't have the handicap of having to spend time worrying about solid research, and they clearly care a lot about managing what other people think of them.
There is no way that's true in my research group. We would - and do - call each other out on our mistakes at every group meeting. Straight up fraud would not be tolerated.
I think people are underestimating the psychological impact of failing to act on fraud in your research. Once you've learned that you're a fraud and what you're doing is bullshit, it would be soul-crushing to go to work every day.
I have a feeling that non-academics on the Internet have an overly bleak view of the ethics in acamedia. I don't know why, but several online influencers really like to peddle the idea that we're all corrupt.
Upvoted for the optimism. And I presume it's just a typo, but "acamedia" is wonderfully evocative for cases where academic papers are to be written primarily to provide a foundation for future popularization and commercial use.
Hm, it looks like the Greek etymology is disputed (it was named after a place rather than with purpose).
ake = silent
heka ... in borrowing normally means "one hundred", but in Greek it's actually "he" (one) + "katon" (hundred), so "heka" doesn't mean anything. (eka- in chemistry is Sanskrit and unrelated)
demos = district
medius = middle in Latin; the Greek is mesos, which seems to maintain more of the explicit "between/among" sense (for which Latin has "inter")
> There is no way that's true in my research
That's rare. I also worked in a lab that had a high degree of internal discourse freedom and a low tolerance for fraud. It still took a lot of fortitude for one of the grad students to force a retraction of 4-5 of the lab's papers (but it did happen). And many of my colleagues lost years of their lives chasing other labs' misconduct. So either it's a widespread problem, or even the people in the good labs are too dimwitted to notice when things are sheisty
>Straight up fraud would not be tolerated.
damn, you started off so hard-core and no-nonsense like seal team 6 had entered the research review arena, then your finishing blow to fraud was... it's soul-crushing to the fraudster?
No, to the team. Science is collaborative. If you learn that the results you are working with and building on are fraudulent, what's even the point? Trying to replicate a team member's results and having it never work, starting new projects designed around fake data, continuing to tolerate and endorse that fraud by keeping quiet: the entire group's work becomes meaningless. Fraud infects all the research that references it and builds on it. Would you want to spend 5 years of your life futilely trying to build on your collaborator's work if you know at the start that their data is fake?
> There is no way that's true in my research group.
Well, congratulations to you, but there's a real crisis of legitimacy in some fields, especially once they've been filtered through the pop-sci press-release filter, which only makes things less accurate and the claims more extravagant.
Social science and psychology have big replication crises and seemingly no recent usable results at all. Nutritional science seems to be compromised by the food industry. Economics blatantly has different "schools" which give different answers depending on politics. "Business science" seems barely worthy of the name of science at all.
The risk is that mistrust from those spills over into scientifically settled but politically controversial areas.
I mean, have you ever seen things work out for a whistleblower?
At best, they get some 15 minutes of praise. At worst, straight to jail.
I don’t work anywhere near academia. I still wouldn’t blow the whistle or advise others to do so.
I'm not sure if Elisabeth Bik is exactly a whistleblower but she's doing ok https://www.technologynetworks.com/biopharma/news/theres-fra...
>Bik’s venture into scientific sleuthing began as a hobby when she detected duplicated western blot images in a PhD thesis.
Now full time.
>"At worst, straight to jail."
The "at worst" is worse: Two recent Boeing whistleblowers died in rather suspicious circumstances.
I'd rather get a bullet in the head than spend any substantial amount of time in an American prison.
Academic whistle-blowers won't go to jail. Government ones do, and big-company ones do.
Your career is done. Academics tend to rule themselves: think peer review, but for everything. You apply for a grant, and you are on bad terms with the reviewers over some criticism? You get nothing. The editors of the journals are also scientists. Are you on bad terms with a friend of the editor? "Nice work, but it doesn't fit our journal"; also, let me send some emails to make sure you don't get published in the first decile, or your paper gets slowed down for months.
You deserve the Nobel prize, but you criticized someone's work? You get nothing! The Swedish chemist Arrhenius blocked the Nobel prize for Mendeleev because the latter had made some valid scientific criticisms the former didn't like.
Which company whistle-blower has gone to jail?
The Libor bankers, for example [0].
Were they completely innocent? Nobody is completely innocent, Cardinal Richelieu knew that. But they were definitely whistleblowers who went to jail.
[0] https://www.bbc.com/news/business-60561679
Perhaps relevant to a lot of HN readers - the culture of conference publishing in much of CS (I'm in computer systems myself) seems to result in significantly different dynamics.
Journal papers are single-blind, i.e. the reviewers know the names of the authors and can make decisions based on reputation, so if you publish great (e.g. faked) results once, it becomes progressively easier to do it again.
Reviewing for conference papers is typically double-blind - we don't know the identity of the authors, and although we can try to guess, a couple of studies show we're wrong most of the time. Most papers (65-85% at good conferences) get rejected, so it's easy to flush something that looks fishy. (then again, it's hard to tell the difference between someone playing games with their numbers and someone who just has the typical crappy CS systems understanding of statistics) The addition of artifact evaluation to a lot of conferences provides another avenue to flush out faked results, too.
However conference reviewing is vulnerable to reviewing fraud, since reviewers and authors typically come from the same relatively small community - see https://news.ycombinator.com/item?id=23459978, or just google for the words "computer science reviewing cabal".
Reviewing fraud might not be quite as destructive in the long term, as the papers that get published through such fraud are probably more crappy than faked, and will hopefully not get cited. In addition, when reviewing fraud is uncovered it's much more of an open-and-shut case - a single person can fake data and make it look a lot like good data, but review fraud requires communication between people who are explicitly barred from communicating on that topic.
A lot of it depends on the community, and the size of the community. When you have 1000+ reviewers for a conference things will always get out of hand - I know of someone who got his cat registered as a reviewer for a particular SIG conference that he hated, just to make a point. I assume one could get a lot of fraudulent research published in that particular conference.
Sometimes, even a whole conference gets compromised, see e.g. ICIMTECH '21 [1]:
"NOTICE OF RETRACTION: While investigating potential publication-related misconduct in connection with the ICIMTech 2021 Conference Proceedings, serious concerns were raised that cast doubt on the integrity of the peer-review process and all papers published in the Proceedings of this Conference. The integrity of the entire Conference has been called into question. As a result, of its investigation, ACM has decided to retract the Entire Conference Proceedings and all related papers from the ACM Digital Library.
None of the papers from this Proceeding should be cited in the literature because of the questionable integrity of the peer review process for this Conference."
[1] https://dl.acm.org/doi/proceedings/10.1145/3465631
Oh my, the titles in those proceedings are hilarious.
> Retracted on February 24, 2022: Innovation Research on College Student Affairs Management Based on Big Data Environment
> Retracted on February 24, 2022: University Students in the Greater Bay Area Conduct Research on the Construction of Big Data Teams for Cross-border E-commerce Startups in the Middle East
> Retracted on February 24, 2022: Discussion on International Trade Governance under the Background of Big Data
> Retracted on February 24, 2022: Application of Incoterms in International Engineering Based on Information Platform
> Retracted on February 24, 2022: The Fixed-point Tracking Model of Volunteer Service for College Students of Traditional Chinese Medicine Based on Information Technology
> Retracted on February 24, 2022: Sichuan Consumers' Mining of Agricultural Product Brand Value Based on Big Data
> Retracted on February 24, 2022: Sports Consumer Behavior Based on Integrated Data in the Context of Big Data
> Retracted on February 24, 2022: Talent Heights Construction from the Perspective of Talent Concentration
Sounds like they were generated by SCIgen.
I agree about CS reviewing for conferences (esp. in systems) - pretty reasonable for the most part. Also, if things just don't work, they don't work - which is a bit different than studies in social science.
Friend of mine was working in cryptography. His PhD topic was about proving that a cryptographic scheme was secure (for some definition). It took him two months to break the scheme. Sadly, said scheme was key to the funding of his PhD advisor, so the advisor tried to prevent him from publishing findings.
Things end mostly well, with my friend publishing, then restarting his PhD with a different advisor, then leaving academia entirely.
So it didn't end well. Your friend had no career in academia because he criticized his master's work
He seems happy with his current work, so I'd say it ends well for him.
Academia did lose one (more) brilliant researcher, though, so it doesn't end well for academia.
I have always wondered if the threshold for 'number of papers published' to get a professorship is higher than the number of high quality papers that can reasonably be published in that amount of time. So, people cut corners and publish bad papers or commit outright fraud. Because they do that, this keeps the required number of publications unreasonably high.
Peter Higgs would agree:
https://www.theguardian.com/science/2013/dec/06/peter-higgs-...
>Peter Higgs, the British physicist who gave his name to the Higgs boson, believes no university would employ him in today's academic system because he would not be considered "productive" enough.
>The emeritus professor at Edinburgh University, who says he has never sent an email, browsed the internet or even made a mobile phone call, published fewer than 10 papers after his groundbreaking work, which identified the mechanism by which subatomic material acquires mass, was published in 1964.
>He doubts a similar breakthrough could be achieved in today's academic culture, because of the expectations on academics to collaborate and keep churning out papers. He said: "It's difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964."
>Speaking to the Guardian en route to Stockholm to receive the 2013 Nobel prize for science, Higgs, 84, said he would almost certainly have been sacked had he not been nominated for the Nobel in 1980.
Yes. The bureaucratic systems put into place to increase "quality" and "output" are incoherent, so they create even more incentives for fraud. It's paper Stakhanovism.
Aviation once had a culture of strict deference to senior pilots, leading to fatal mistakes when juniors hesitated to question decisions. This changed with Crew Resource Management (CRM), which institutionalized open communication and made questioning a normal part of operations.
Perhaps academia could benefit from a similar culture shift that actively invites scrutiny to improve science overall? The article implies a deference and bullying problem in research, where junior scholars hesitate to challenge authority, even when they notice flaws. Fraud wouldn't be the direct focus of such a shift, but in an environment where questioning is encouraged, it would naturally be harder to commit, with fewer places for bad actors to hide.
> Instead, they repeatedly suggested that a scientific criticism of a published paper had no place in a dissertation
Why not a) finish the dissertation as soon as humanly possible, and b) now that you're bona fide, publish a separate paper disputing the famous paper which your dissertation didn't rely on?
Instead, it sounds like the student and committee advisors were a bad fit and spent most of the time in a pissing contest.
That's not really possible. If you're writing a dissertation and find that some of the important foundational work you're building upon is wrong, you can't just ignore it. If it's wrong, you need to say it's wrong in order to justify why you've chosen not to consider it in your dissertation. If you can't claim it's wrong, then you have to use it in your dissertation, otherwise you're going to leave yourself open to criticism that your dissertation ignores important prior work in the field. So the only choice you have left is to write your dissertation built upon work that you know is trash, and then if you do later choose to publish criticism of that work you're essentially trashing your own dissertation. And of course, none of that matters, because you're still going to face professional consequences when you choose to write the separate paper disputing the famous paper.
> Why not a) finish the dissertation as soon as humanly possible, and b) now that you're bona fide, publish a separate paper disputing the famous paper which your dissertation didn't rely on?
The last thing you want to do before pointing out that the emperor has no clothes is get a job at the palace.
(Re)-read their story[1]: they were fighting to publish the refutation as the first chapter of their dissertation. They wanted to publish no matter what.
Given that, choose your sword:
1. Grad Sword that must be approved by a dissertation committee
2. Master (well, PhD) Sword where you choose your own adventure
If you reply #1 then I'm logged in to a fake HN site.
1: https://www.theorgplumber.com/posts/statement/
Better link (the actual post mortem): https://www.theorgplumber.com/posts/statement/
As ever, I think the bigger problem is grey areas, fooling oneself, etc., rather than outright fraud. If you really are self-interested and greedy, you'll probably choose another career than academia.
* You could be self interested & greedy and like to see yourself as a famous researcher
* Some researchers make good money
* You could start honest, then realize cheating is the only good way forward given the incentives
Regarding acad-bloggers from Columbia:
Woit does do a bit of grey area coverage. Likewise, his academic pubs are more interesting to me, personally! His reputation, however, is mixed at best. A real hero!
Gelman: maybe he's trying to subdue the neuroticism*. I look forward to more compassionate, less overtly "curiosity"-driven grey-area coverage (not to mention stats papers) from him!
Here is a recent one, sadly also about psychologists (it would be more interesting to do it to card-carrying statisticians)
http://stat.columbia.edu/~gelman/research/published/healing3...
*Not many people describe themselves as an exponent of radical self-honesty since radicality isn't a palatable, much less enjoyable, thing. It takes a real saint (or logician) not to reflexively flinch from the imagined tsunami of incoming pity.
Grey areas-- you can safely explore
Does Woit actually publish anything in well-regarded journals? My impression was that he wasn't really an active researcher at all.
Sorry, I was trying to be generous with "mixed rep at best". He is reasonably active (on arXiv, say, which is a red flag I suppose); it's just that his blog gets far more attention. I would say that anybody in theory more active than Woit would immediately arouse my suspicion :)
Eh, very few fraudsters start with a plan to commit fraud. More often than not, it's legitimate intent, but then thoughts happen along the lines of "I could make my ten years of work much more effective by just being a tiny little bit dishonest!" And then it becomes, "but now I have to lie about this other slightly bigger thing too, else it will become suspicious!" And eventually, "now that I've already lied so much, I might just make stuff up."
Particularly in research, where you may be working on the same idea for decades, it can be damning when you realize that the idea might just be fruitless. If all you had to do was remove one or two zeroes, and finally your colleagues would praise instead of mock you, wouldn't that be great?
An example on a smaller scale: I remember how at uni, people who did none of the work could end up with better grades. I remember toiling for weeks on a hard "group" project alone, and then my groupmates who contributed nothing got better grades, because on the project we all got the same grade, and on the exams I got worse ones as I had no time to study well.
There is always a push to do original research. Maybe a partial solution here is to encourage, or even require, everyone to take up a major and a minor specialty: in the major you do original research, and in the minor you work to reproduce existing research. Building that in from the start would go a long way toward creating a culture of trust-but-verify in these fields.
Not sure how it is in all fields but at least in the molecular bio world there's a lot more replication than meets the eye. Especially methods, it's just that there won't be a replication paper, it'll be in the methods/supplementary info of the new research paper ("we used xyz techniques as described in researcher et al").
Null publishing bias still exists and obviously it's not remotely a perfect system. However, it's not some sort of completely broken system where every paper is either fraud or replication, and interventions that start from that sort of an assumption (people are very aware of the need to trust but verify) are in my opinion unlikely to be useful.
If the world is at a 60/100, you need to focus on things that get you from 60 to 80, not from 0 to 20
Seems like wishful thinking, much like saying engineers at Amazon can do whatever they want despite what Bezos says. Yes, they can, but they'd get fired. And yes, incentives could be changed, but they won't, because the person running the organization doesn't want to.
Research funding and research institutions are existing things that operate in specific ways. One can lobby to change this, but unless you want to go back to the "gentleman scientist" model of the 1700s and 1800s, with a scientific establishment about 1% the size of today's, you can't just buck the system.
The New Yorker article from 2023 about this saga is worth a read:
https://www.newyorker.com/magazine/2023/10/09/they-studied-d...
Technical details of the fraud can be found in four famous-ish blog posts of the Data Colada group, also from 2023, beginning with:
https://datacolada.org/109
I'm so glad that Zoe Ziani essentially has the last word on this subject.
For what it's worth, while I'm sure her advisors are known within that research community, I believe they deserve to be named and shamed. Their behavior, IMO, is almost as bad as the original fraud. It was classic circle-the-wagons bullying, and their scientific reputations absolutely deserve the hit their atrocious behavior warrants.
I want to emphasize that this isn't about "retribution" or getting back at these people; it's about publicizing that their behavior was wrong, the antithesis of what scientific inquiry is about, and fundamentally rooted in a cowardice that put professional standing on a higher pedestal than scientific integrity. Actions have consequences, and these advisors should face consequences for theirs.
Fraud is basically the only career path in our current fake economy. Everyone is committing fraud on some form or another. Your company, you personally, your government, etc.
As sensationalist as this sounds I agree. I think a major turning point was when tech became a big business by charging nothing.
1. It’s impossible to compete with free, so businesses that do add real value are at a disadvantage.
2. A future where tech adoption happened slowly and people eased into being comfortable actually paying for the things they value never arrived, and now it feels impossible, as free is expected.
3. It’s created bad incentives, like keeping people on their devices as much as possible and showing them only things that enrage them or confirm their prejudices.
4. None of it is really free, because someone pays for it all; those costs are borne by society anyway, while society gets less choice.
5. I think over time this ethos has become mainstream and it’s essentially a fraud mindset. Nothing matters as long as the line goes up.
(This process and the end results are surprisingly similar to academic fraud. “Everything is free” is the same thing as inventing whatever results people want to hear. Both will ultimately lead to the same, unsustainable end result, as people will no longer trust research because everything is fake.)
Even though I have very high moral standards and refuse to work at shady places, almost every place I have ever worked, and places people I know have worked, have committed some kind of fraud. I don’t know if they all met the legal definition of fraud, but they are all fraud in my book.
I’ve seen fudging analytics and subscriber numbers to lie to ad buyers. I’ve seen people intentionally hold items over from one quarter to the next for accounting purposes. I’ve seen events use dubious counting methods to inflate their attendance figures. I’ve even heard stories about hospitals moving patients from failed surgeries back to their floor before they die in order to fudge the surgery survival stats.
A lot of this is Goodhart’s law in action. But also, when these very tiny frauds go unchecked in a competitive marketplace, everyone becomes forced to do them. If law enforcement won’t punish them for being evil, the market will punish them for being good.
Bingo. This is why in the latest Writers Guild / SAG strike, the studios would not budge on transparency for streaming residuals. They've lied too much to let the true stream counts for their content be seen.
It's ironic that one of Gino's partners in crime, Dan Ariely, has been writing bestselling pop-sci books about dishonesty, based on his own faked data.
World leaders are casually scamming people for millions and nobody is batting an eye.
Society has at large approved of fraud. I’m not being a lone holdout.
No, it has not. It depends on science to postpone its demise. Fraud may be internationally acceptable, but it's still worse than worthless in the West.
> Fraud may be internationally acceptable, but it's still worse than worthless in the West.
? I thought OP was referring to things like the presidential cryptocurrency, the ultimate endorsement of pump-and-dump fraud.
Maybe for the top 1 or 10% folks out there who are good enough at fraud to actually reliably get away with it. For the rest of us, I don't think this is true.
That’s the saddest part, you’d think that, but the fraud here (and other similar cases) is so blatant it’s painful.
Modern academia has terrible incentives. The easy path to success comes from not rocking the boat; even the guys at Data Colada (who did the legwork to prove the case) have made a lot of enemies in the field.
I’m willing to bet there’s a large amount of less significant fraud that gets better defended; it’s only the people who are so successful they let their guard down who get caught in this silliness.
Not in academia; I’m curious but not invested.
> “I formed the opinion that I shouldn’t use this paper as a building block in my research. […] However, my advisor had a different view…”
> “…scientific criticism of a published paper had no place in a dissertation…”
Is this dissertation catechism?
Or a clash of personalities?
Maybe we should modernize the publishing system for the information age, so that papers are no longer published as a permanent, fixed event but rather put up online tentatively. From there they could be taken down at any point by third parties, autonomously and without any say from the original "publisher", if they can't be replicated.
Relying on a potentially fraudulent originator to take the thing down is apparently too much to ask.
There’s a lot that sucks about publishing in security and privacy. But wow this article makes me super grateful that we are still allowed to write attack papers and critique famous works.
Wasn't science supposed to be a set of tools for avoiding lying to yourself about reality, on an individual level and as a community?
What's missing from the toolbox?
Science still works if the newly found knowledge can somehow be transformed into technology. Technology is perhaps the strongest form of reproducibility there is. This is true even for social science, as its findings will make their way into novel marketing approaches that clearly generate more revenue.
Same could be said about pre-scientific ideas. It's a bit of a copout to judge science by such a low bar.
I'm not sure "pre-scientific" is a meaningful distinction. Maybe "pre-scientific-method"?
Certainly one major problem is: although we live after the scientific method was developed, things can still be done without actually following its principles.
In a few words, human nature wasn't built for rational thought all the time.
All of human progress is basically a struggle against common sense.
The scientific method exists because of fraud, confusion, and other issues.
Like any set of tools, it has to be actually applied. There is a long and storied history of scientific advances and discoveries turning out to be fraudulent, because there are many incentives for fraud, or just plain delusion. "Delulu is the solulu" has always been tempting.
There is nothing missing from the toolbox here. It just has to be applied.
Funding, probably.
Maybe science should be anonymous? Somehow?
Or maybe we are gonna be saved by more and more of science getting depersonalized by the virtue of being created by AI?
Good luck making your tenure case!
Should tenure even be a thing?
I think we all realize this is true since a current cult leader is president. With three letters scripting his act.
Fraud.
Humans are much, much better at lying than detecting lies. Hence, fraud is a universal problem.
And similarly, corruption is a viable system of government. For a while...
It's not even that bad of a system. The Roman Republic conquered the civilized world with a political system based on formalized clientelism.
And anyone who's seen "honest" public procurement will see the benefit to a system where the governor can call his friend and have a railway built by the end of the year.
See also:
https://www.science.org/content/article/after-honesty-resear...
https://www.vox.com/future-perfect/24107889/francesca-gino-l... "In response, Gino — who has maintained her innocence — sued Harvard and the bloggers who first published the allegations, claiming she’d been defamed. "
Thankfully it appears the judge tossed out her lawsuit.
Always has been.
Not only academic.
Look at the 2008 crisis. Only one banker went to jail.
Look at Binance’s Zhao.
Look at - ahem - Trump.
Here are six jailed bankers named in two stories:
https://www.bbc.co.uk/news/business-68595204
https://www.cityam.com/former-barclays-trader-jay-merchant-j...
This is related to the Libor rigging case, not the 2008 financial crisis.
Thankfully in some cases, people do actually go to jail.
The_Secret_Ingredient_Is_Crime.gif
"enuf"
I don't think this is incorrect but it's definitely a grey area to spell it this way. It would be ironic to be downvoted for pointing it out based on the intent of the article.
I clicked on the article hoping it's about economic fraud. /s
Alas, fraud was always a viable path. The phrase "fake it till you make it" wasn't coined without reason.
I always understood and used that idiom a little differently. I think of it as more about appearances vs. internals, e.g. imposter syndrome and the time it takes to develop skills fully. Nobody knows what they are doing on their first day at a new job, but they have the title already.
Columbia is a bad school (barnard college being even worse).
This article is ironic given the expulsion of many students who expressed support for Palestine. It goes into why a student's dissertation needed another defense, while the same institutions are arresting and banning students en masse.