“Let us git rid of it,” angry GitHub users say of forced Copilot features
Alternatives & Centralization Concerns
- Multiple commenters say they’re moving or donating to Codeberg, Forgejo, or self‑hosted GitLab; some note GitLab is also pushing AI and expect eventual community forks.
- Debate over whether GitHub is “critical infrastructure” or just a fancy git server with PRs. Some say outages kneecap companies and that GitHub effectively serves as a CDN; others argue that relying on it that way is bad engineering practice, not evidence of inherent criticality.
- Strong regret that so much FOSS landed on a proprietary, VC‑funded platform, making the community hostage to a corporate owner; others reply that convenience and network effects made this outcome predictable.
- GitHub stars, free CI (including macOS/Windows), and packages are seen as major lock‑in mechanisms beyond pure git hosting.
Reality of Copilot PR/Issue Spam
- Several maintainers of popular projects report seeing zero Copilot‑authored PRs or issues; they suspect the scale of the problem is overstated.
- Clarification: Copilot does not automatically open PRs or issues; a human has to trigger it. The main GitHub discussion is about blocking the copilotbot account, not banning all AI‑authored content.
- Others worry about LLM‑generated “sludge” from any tool (ChatGPT, Claude, etc.), especially around events like Hacktoberfest or bounty programs.
Forced AI Features & User Hostility
- Strong frustration with Copilot being surfaced everywhere: GitHub UI, VS Code, Visual Studio, Office 365, and other products. Many describe it as “forced” or dark‑patterned, with limited or hidden off‑switches.
- Some report Copilot review comments blocking automerge for trivial remarks, and accounts shown as “enabled” for Copilot even when settings say otherwise; GitHub support is described as evasive.
- Comparison to other “enshittified” products (Google Docs, GCP console) where core quality stagnates while AI buttons proliferate.
Metrics, Hype, and Business Incentives
- Skepticism about claims like “20M Copilot users” when access is auto‑provisioned or mandated by management, often unused.
- Many see the AI push as driven by KPIs, investor expectations, and ecosystem self‑interest (e.g., GPU vendors), not organic developer demand.
- Parallels drawn to crypto and self‑driving hype cycles and to the McNamara fallacy: chasing engagement numbers while ignoring user experience.
Usefulness vs. Cost of LLMs
- Some developers report substantial productivity gains for prototyping in unfamiliar languages, exploratory scripts, or navigating large new codebases.
- Others find LLMs useful mainly as fuzzy search / brainstorming tools, with limited or negative net productivity once review and corrections are included.
- Environmental and infrastructure costs are raised; critics argue the benefits don’t yet justify the scale or the aggressive rollout.
Control, Policy, and Mitigations
- Workarounds mentioned: hiding AI features in VS Code (the “Chat: Hide AI Features” command), org‑level Copilot disable in GitHub, Visual Studio’s “hide Copilot” option, and uBlock filters to block Copilot commit‑message generation.
- Proposals include blocklists for AI‑slop contributors and allowing maintainers to block the copilotbot account like any other user; a sketch of what that could look like follows this list.
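For the “block the bot like any other user” proposal, the mechanics already exist for ordinary accounts: GitHub’s REST API exposes PUT /orgs/{org}/blocks/{username} (and PUT /user/blocks/{username} for personal accounts). Below is a minimal sketch assuming that endpoint; the organization name and the bot’s login are placeholders, and whether GitHub will actually accept a block on the Copilot bot is exactly what commenters are asking for, so expect a refusal today.

```python
# Sketch: attempt to block an account at the organization level using GitHub's
# standard "block a user" REST endpoint (PUT /orgs/{org}/blocks/{username}).
# This works for ordinary user accounts; the bot login used here ("copilotbot")
# is a placeholder, not a confirmed identifier, and GitHub may reject the call
# for app/bot accounts.
import os
import sys

import requests

GITHUB_API = "https://api.github.com"


def block_user(org: str, username: str, token: str) -> bool:
    """Block `username` from `org`. Returns True if GitHub accepted the block."""
    resp = requests.put(
        f"{GITHUB_API}/orgs/{org}/blocks/{username}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28",
        },
        timeout=30,
    )
    if resp.status_code == 204:  # 204 No Content means the block succeeded
        return True
    # A 422 (or similar) typically means the account cannot be blocked.
    print(f"Block failed: HTTP {resp.status_code} {resp.text}", file=sys.stderr)
    return False


if __name__ == "__main__":
    org_name = sys.argv[1] if len(sys.argv) > 1 else "my-org"       # placeholder org
    target = sys.argv[2] if len(sys.argv) > 2 else "copilotbot"     # placeholder bot login
    ok = block_user(org_name, target, os.environ["GITHUB_TOKEN"])
    print("blocked" if ok else "not blocked")
```

The point of the sketch is that no new machinery would be needed on the client side; the feature request is simply for GitHub to stop special‑casing its own bot in this existing blocking flow.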
Corporate Behavior & Regulation
- Long thread on why Microsoft was allowed to buy GitHub, whether it was already “critical” at acquisition, and the role of antitrust (compared to Adobe/Figma).
- Some argue corporations are doing exactly what they’re designed to do—maximize profit—and that only regulation and better initial choices (FOSS forges) could have prevented this dynamic.