Studio Ghibli, Bandai Namco, Square Enix Demand OpenAI Stop Using Their IP
Anti-piracy analogy & data harvesting
- Many compare AI training on copyrighted works to classic piracy: “downloading content for AI training is stealing.”
- Others argue the “you wouldn’t steal a DVD/car” analogy is weak because digital copies have zero marginal cost and harm is indirect or market-dependent.
- Some highlight the irony that past anti-piracy campaigns themselves used infringing material, underscoring the complexity and hypocrisy around IP.
Ads, attention, and what counts as “payment”
- One side claims pervasive advertising “steals” time, attention, mental health, and device resources.
- Counterargument: viewing ads is a voluntary payment for a service; you can refuse by not using the service or by paying directly.
- Tension appears when companies call ad-blocking “theft” while asserting ads are a fair exchange.
Transformative use, scale, and AI vs humans
- Broad agreement that AI pushes the limits of “transformative use” doctrines: the law never anticipated systems that ingest everything and output in any style at scale.
- Some insist that embedding copyrighted works in vector spaces is not meaningfully transformative (see the toy sketch after this list); others say we don’t fully understand human creativity either, so process-based distinctions may be shaky.
- A recurring theme: scale and automation change the ethical and legal calculus even if AI “learned” similarly to humans.
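
As a toy sketch of what “embedding a work in a vector space” means (a hashed bag-of-words stand-in, not how production models actually represent text), the technical claim in dispute is that the original work is reduced to a short list of numbers rather than stored verbatim:

```python
# Toy illustration only: hashed bag-of-words, not a real neural embedding.
# It makes the shape of the dispute concrete: the work survives only as coordinates.
import hashlib

def toy_embed(text: str, dims: int = 8) -> list[float]:
    """Map text to a fixed-length vector by hashing words into buckets."""
    vec = [0.0] * dims
    for word in text.lower().split():
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % dims
        vec[bucket] += 1.0
    total = sum(vec) or 1.0
    return [v / total for v in vec]  # normalize so texts of different lengths compare

print(toy_embed("My neighbor waits at the bus stop in the rain"))
```

Whether that reduction counts as “transformative” in the legal sense is exactly what the thread disputes; the sketch only illustrates the process being argued over.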
Style, copyright, and legality
- Several comments stress that styles are generally not copyrightable; specific characters, plots, and compositions are.
- Disagreement over whether painting in a “Ghibli style” is infringement, fair use, or simply non-actionable inspiration, especially for non-commercial personal work.
- Others argue that when a commercial product (e.g., OpenAI) systematically enables Ghibli-like output and sells access, it crosses into direct competition and likely infringement.
Artist livelihoods and cultural impact
- Strong concern that AI undermines artists’ ability to earn a living by cheaply cloning styles built over lifetimes.
- Some say this is akin to “corporate piracy” or exploitation; others counter that art has always been copied and that business models—not art itself—must adapt.
- A few take a hard line: many artists may have to “get a stronger business” or leave the profession; others warn that losing working artists degrades culture, critical thinking, and “human” entertainment.
Enforcement, jurisdictions, and future models
- Debate over whether model training is currently illegal; some say it’s clearly willful commercial infringement, while others assert that training itself is lawful and only specific outputs may infringe.
- Non-US perspectives note that many countries lack broad fair-use concepts; examples from Japan suggest that even using Ghibli-like AI images commercially could trigger counterfeiting laws.
- Some expect outcomes analogous to Napster (shut down by litigation) vs. YouTube (licensed); others predict large payouts, “firewalls” around national IP, or robots.txt-style opt-outs becoming mandatory.
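
For context on the opt-out idea: the closest existing mechanism is a voluntary robots.txt directive, not a grant or denial of training rights. A minimal example, assuming a site owner wants to block OpenAI’s documented GPTBot crawler from the whole site:

```
# robots.txt at the site root
# GPTBot is the user-agent token OpenAI documents for its crawler;
# compliance is voluntary and does not cover material scraped earlier.
User-agent: GPTBot
Disallow: /
```

A “mandatory” version would effectively give this honor-system convention legal force.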
Fairness, double standards, and what the law ought to be
- Several point out a double standard: individuals happily pirate but condemn OpenAI; big tech invokes “fair use” aggressively while defending its own IP.
- Others emphasize that beyond what’s currently legal, society must decide whether it’s fair for a few companies to appropriate “the treasure of humanity” without consent or attribution.
- There’s no consensus: views range from “end fair-use harvesting” to “I hate copyright more than I hate AI companies,” with many admitting they are genuinely torn.