Nearly a third of social media research has undisclosed ties to industry
Industry Ties in Social Media Research
- Many commenters say the findings are unsurprising and mirror patterns seen in tobacco, fossil fuels, pharma, food, AI, and crypto.
- Some argue industry funding is almost inevitable: the companies hold the data, the infrastructure, and the money; without them, little large-scale research would be possible.
- Others stress that this poses serious risks for policymaking, since researchers have a built-in incentive not to anger their funders.
- A minority questions the study’s methodology, especially its counting of prior co-authorship with an industry employee as a “tie” requiring disclosure.
Trust, Disclosure, and Independence
- Several express deep distrust of both industry and academia, seeing universities as engaged in “reputation laundering” for corporate interests.
- Others emphasize that undisclosed ties warrant closer scrutiny of findings but don’t automatically invalidate them.
- A coalition for independent tech research is mentioned as an attempt to counterbalance corporate influence.
Ethics and Regulation of Corporate “Research”
- Strong concern that social media companies can run large-scale behavioral experiments without independent ethics review, unlike academic researchers, who must clear institutional review boards.
- Debate over what “research” actually is:
  - One side argues A/B tests and emotion-manipulation studies clearly qualify and should face oversight.
  - The other side warns that over-regulation would block useful analysis (including research that detects harms) and notes that ordinary business experimentation already resembles research.
- Some see academic ethics boards as overbearing but necessary given past abuses.
Social Media as a Grand Experiment
- Many frame social media as a massive, poorly regulated experiment in connecting everyone and optimizing for engagement, outrage, and emotionalism.
- Comparisons to leaded gasoline and Big Tobacco: slow, large-scale harm whose full cost may only become clear decades later.
- Extended discussion of algorithmic feeds:
  - Critics: they amplify rage, create echo chambers, normalize extremism, and differ from earlier forums by making toxicity the default rather than an opt-in subculture.
  - Others note toxicity long predates algorithms (Usenet, forums, cable news, yellow journalism); algorithms mainly scale and automate it.
Coping and Policy Ideas
- Personal responses: delete apps, add “friction,” use non‑algorithmic tools (newsletters, chronological feeds).
- Policy suggestions: stronger disclosure norms, data transparency, limits on platform data ownership, structural reforms to reduce monopoly power, and rethinking the balance between free speech, Section 230, and algorithmic editorial control.