False claims in a widely cited paper

Perceptions of Peer Review and Publication Incentives

  • Many see peer review as weak gatekeeping: it pleases deans and publishers more than it ensures quality, and lets confirmatory or trendy work through while blocking some original work.
  • Others argue that genuinely good work usually finds a home after a few submissions, though journal “quality” matters less than outsiders assume.
  • Pressure for fast turnaround and reliance on overworked or junior reviewers are cited as symptoms of a broken system.

Field Differences and Reliability

  • Several commenters claim rigor drops as you move away from math/physics toward biology and then the social and management fields, where measurement, repeatability, and incentives are weaker.
  • Counterpoints note that every field produces dubious results; the difference is how quickly communities detect and ignore them.
  • Some highlight that social sciences can be statistically sophisticated but suffer from ideological bias and replication problems.

Business/Management Research and Journals

  • Business and management journals are widely portrayed as especially low-credibility, with work often described as buzzword-driven, pandering, or post‑hoc rationalization.
  • The specific sustainability/management paper in question is viewed as a “management fad magnet,” eagerly cited because it validates popular policies.

Corrections, Retractions, and Accountability

  • Commenters are alarmed that the journal’s formal policy allows only authors to request corrections, which they see as effectively blocking fixes when the authors are uncooperative.
  • Some see this as a typical volunteer‑run journal stance: editors avoid owning controversies and expect critics to publish separate comments or replications.
  • Others argue that journals should actively investigate serious challenges and, when warranted, issue corrections or retractions, especially in cases of fraud or gross negligence.

Data Torture, Bias, and ‘Cui Bono’

  • Multiple examples are given of “tuning” methods or matching criteria until desired results appear, especially in litigation and policy‑adjacent work.
  • Commenters stress “who benefits?” as a key heuristic: sustainability, diversity, or industry‑funded studies that always favor sponsors or fashionable causes are treated with extra skepticism.
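The “tuning until it works” failure mode described above has simple arithmetic behind it: each additional specification is another independent draw at the significance lottery. A minimal, hypothetical Python sketch (not from the discussion; names are illustrative) of how many false positives 20 specifications buy on pure noise:

```python
import random

random.seed(0)

def false_positive_rate(n_specs: int, z_crit: float = 1.96, trials: int = 20_000) -> float:
    """Fraction of pure-noise 'studies' in which at least one of n_specs
    independent specifications yields |z| > z_crit (i.e. p < 0.05)."""
    hits = sum(
        1
        for _ in range(trials)
        if any(abs(random.gauss(0, 1)) > z_crit for _ in range(n_specs))
    )
    return hits / trials

# One pre-registered test keeps the error rate near the nominal 5%...
print(f"1 specification:  {false_positive_rate(1):.2f}")
# ...but trying 20 specifications and reporting only the "best" makes a
# false positive the most likely outcome (1 - 0.95**20 is about 0.64).
print(f"20 specifications: {false_positive_rate(20):.2f}")
```

This is why reviewers ask whether the reported specification was chosen before or after seeing the results: on noise alone, a researcher free to pick among 20 matching criteria will “find” something most of the time.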

Systemic Incentives and Possible Reforms

  • “Publish or perish” and accreditation requirements for faculty are seen as a classic case of Goodhart’s Law: counting papers instead of judging substance.
  • Proposals include GitHub‑like open repositories with visible issue tracking, more transparent review, and broader use of post‑publication critique.
  • Brandolini’s law (refuting bullshit takes an order of magnitude more effort than producing it) is invoked, with debate over whether AI/LLMs will worsen the flood or make refutation cheap.