Meta's open AI hardware vision

Open vs. Closed AI Platforms

  • Many see Meta positioning itself as the “open” counterweight to OpenAI/Anthropic, analogous to Android vs. iOS or Windows vs. macOS.
  • View that Meta’s strategy is to “destroy the moat” by commoditizing models and hardware, while OpenAI/Anthropic race to build moats.
  • Some question whether Meta will stay “open” long-term, citing past platform shifts (Facebook APIs, VR platform tightening and later loosening).

Meta’s Business Model and Motives

  • Debate whether Meta intends to “sell LLMs” directly versus using them to power its own products, ads, and engagement.
  • Several commenters frame this as classic “commoditize your complement”: make models and hardware cheap/open to protect and enhance Meta’s ad and social businesses.
  • Some argue this is mainly defensive: it prevents lock‑in to competitors’ closed ecosystems and heads off an existential business risk for Meta.

“Open” Licensing and LLaMA Controversies

  • Strong disagreement on whether LLaMA is truly “open source.”
  • Criticisms: restrictive license terms (the 700M monthly-active-user threshold, field-of-use limits, mandatory “Built with Llama” branding), the ban on EU use of the multimodal models, and the absence of training data and code.
  • Counterpoint: for practical purposes, weights are the “source” needed for modification; training pipeline openness is less essential.

Hardware Strategy and NVIDIA Dependence

  • Meta’s open rack and networking designs (OCP contributions, the Disaggregated Scheduled Fabric) and its in-house MTIA accelerator are seen as ways to weaken NVIDIA’s system-level moat and enable future non‑NVIDIA options (e.g., AMD).
  • Some say this is still great news for NVIDIA in the short term; others see it as laying groundwork to reduce long‑term dependence and cost.

Economics of Large Models

  • Back-of-envelope estimates put Llama 3.1 405B training at hundreds of millions of dollars in hardware, plus operating costs (see the cost sketch after this list).
  • The thread disputes claims about Meta’s valuation gains; some frame AI as a stock-price “pump,” others note the broader market rose over the same period.
  • No consensus on whether anyone is yet net-profitable on LLMs; some think the value is defensive and long-term (moderation, AI ads, PR, hiring).
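
A minimal back-of-envelope sketch of the training-cost claim, in Python. The inputs are assumptions for illustration: the ~30.8M H100 GPU-hour figure matches Meta's Llama 3.1 model card, while the cluster size, unit price, and cloud rate are rough placeholders rather than figures from the thread.

```python
# Back-of-envelope cost sketch for training Llama 3.1 405B.
# All inputs are illustrative assumptions: ~30.8M H100 GPU-hours
# comes from Meta's model card; cluster size, unit price, and
# cloud rate are rough placeholders, not reported figures.

GPU_HOURS = 30.8e6        # reported H100 GPU-hours for the 405B run
NUM_GPUS = 16_000         # assumed cluster size (order of magnitude)
GPU_UNIT_COST = 30_000    # assumed $ per H100, purchased outright
CLOUD_RATE = 4.00         # assumed $ per GPU-hour if rented instead

capex = NUM_GPUS * GPU_UNIT_COST     # buying the cluster
rental = GPU_HOURS * CLOUD_RATE      # renting equivalent GPU-hours

print(f"Cluster capex:    ~${capex / 1e6:,.0f}M")   # ~$480M
print(f"Rented GPU-hours: ~${rental / 1e6:,.0f}M")  # ~$123M
# Either way the run sits in the nine-figure range before networking,
# power, and operations are added, consistent with the thread's
# "hundreds of millions" estimates.
```

The gap between the two figures is part of the debate: purchased hardware amortizes across many training runs, while rental rates include a provider margin.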

Chips and Energy

  • Discussion of whether big players should jointly define open AI chips; most expect each to keep designing proprietary accelerators instead.
  • Expectation that future AI datacenters will colocate with large, low-emission power sources, especially nuclear, given their massive energy needs (a rough power estimate follows this list).
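
To put “massive energy needs” in rough numbers, a minimal sketch; the accelerator count, per-GPU draw, and overhead multiplier are all assumptions for illustration, not Meta's figures.

```python
# Rough power-draw sketch for a frontier-scale training cluster.
# All inputs are illustrative assumptions, not Meta's figures.

NUM_GPUS = 100_000     # assumed accelerator count
GPU_POWER_W = 700      # assumed per-GPU draw (H100-class TDP)
OVERHEAD = 1.5         # assumed factor for CPUs, networking, cooling

facility_mw = NUM_GPUS * GPU_POWER_W * OVERHEAD / 1e6
annual_gwh = facility_mw * 8760 / 1e3   # running flat-out for a year

print(f"Facility draw: ~{facility_mw:,.0f} MW")   # ~105 MW
print(f"Annual energy: ~{annual_gwh:,.0f} GWh")   # ~920 GWh
# ~105 MW continuous is roughly a tenth of a large nuclear reactor's
# output, which is why colocation with such plants keeps coming up.
```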