News

GitHub’s AI Agent Tsunami: 275 Million Commits a Week, 14 Billion Projected for 2026 — And the Platform Is Starting to Crack

Author: Viacheslav Vasipenok · 5 min read

GitHub just hit numbers that would have sounded like science fiction twelve months ago.

Across all of 2025, the platform recorded its first-ever 1 billion commits in a single year — a milestone that had employees popping champagne. Fast-forward to April 2026: GitHub is now processing 275 million commits per week. At the current pace, 2026 is on track for roughly 14 billion commits — a 14× explosion in a single year.
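The projection is simple run-rate arithmetic. A quick sanity check, using only the figures quoted in the post:

```python
# Sanity-check the run-rate projection quoted in the post.
weekly_commits = 275_000_000      # commits per week, April 2026
annualized = weekly_commits * 52  # naive linear extrapolation over a year
baseline_2025 = 1_000_000_000     # total commits across all of 2025

print(f"{annualized / 1e9:.1f}B commits projected for 2026")      # ~14.3B
print(f"~{annualized / baseline_2025:.0f}x the 2025 total")       # ~14x
```

Which is exactly why Daigle's "if growth remains linear" caveat matters: the 14-billion figure assumes the April weekly rate simply holds for 52 weeks.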

GitHub COO Kyle Daigle put it plainly in a recent post:

> “There were 1 billion commits in 2025. Now, it’s 275 million per week, on pace for 14 billion this year if growth remains linear (spoiler: it won’t).”

The same surge is hammering GitHub Actions: weekly compute minutes have jumped from 500 million in 2023 to 1 billion in 2025 — and already hit 2.1 billion minutes in a single week this year.

Even more telling is the source of the flood: AI agents.

According to The Information, the number of pull requests opened by AI agents surged from roughly 4 million in September 2025 to more than 17 million in March 2026 — more than a 4× increase in six months. Agents aren’t just generating code in isolation; they’re cloning repos, pushing branches, opening PRs, running CI/CD pipelines, and iterating at machine speed.


This Was Always the Destination

GitHub has spent the last decade becoming the de facto infrastructure layer of software development. Every serious project lives there. The CLI, the API, the web interface — they were all built so that *anything* could interact with git at scale.

Now “anything” includes thousands of autonomous AI agents.

These agents don’t browse GitHub like humans. They hammer it like scripts: `git clone`, `git commit`, `git push`, `gh pr create` — all day, every day, across thousands of repositories simultaneously. They don’t need to read the UI. They just need the API and the CLI.

And here’s the catch: most of them pay nothing.

Free accounts, open-source repositories, and generous public API limits were designed for human developers and occasional CI bots — not for fleets of AI agents that can generate more code in an hour than an entire engineering team in a week.


The Inevitable Strain — and the Outages

The result? GitHub has been visibly struggling. The same piece in The Information that broke the numbers also reported repeated outages and performance degradation tied directly to the AI-driven traffic spike. Servers that were sized for human-scale usage are now being asked to handle agent-scale usage.

Daigle’s team is “pushing incredibly hard on more CPUs, scaling services, and strengthening GitHub’s core features,” but the growth curve is so steep that capacity planning has turned into crisis response.


So When Do the Limits and Paid Agent Tiers Arrive?

This is the question every GitHub watcher (and every AI startup) is asking right now.

GitHub already has Copilot Enterprise, GitHub Advanced Security, and higher-tier API limits for large organizations. But those were built for *human* teams, not for autonomous agent swarms that can burn through millions of API calls and Actions minutes without a human ever logging in.

Possible scenarios we’re likely to see in the next 6–12 months:

  • Agent-specific rate limits — separate, much stricter quotas for traffic that exhibits “non-human” patterns (e.g., commit bursts with no browser sessions, identical PR templates, zero review comments).
  • Dedicated AI Agent plans — paid tiers for organizations and AI platforms that want unlimited or high-limit agent access, similar to how cloud providers sell GPU fleets.
  • Usage-based billing for Actions and API — moving more workloads from “included in plan” to metered pricing when the consumer is clearly an agent.
  • Enterprise “Agent Hosting” — GitHub offering to run your agents closer to the metal (or even inside their own infrastructure) for a premium, solving both latency and rate-limit problems.
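To make the first scenario concrete, here is a purely hypothetical sketch of how "non-human" traffic might be scored. Every feature name and threshold below is invented for illustration — this is not GitHub's actual detection logic:

```python
from dataclasses import dataclass

@dataclass
class TrafficProfile:
    """Invented per-account signals for illustration only."""
    commits_per_hour: float      # sustained commit rate for the account
    has_browser_session: bool    # any interactive UI session observed
    identical_pr_templates: int  # PRs sharing a byte-identical description
    review_comments: int         # human review comments received

def looks_like_agent(p: TrafficProfile) -> bool:
    """Score a profile against hypothetical 'non-human' signals."""
    score = 0
    if p.commits_per_hour > 30:        # burst rate no human sustains
        score += 1
    if not p.has_browser_session:      # API/CLI only, never the UI
        score += 1
    if p.identical_pr_templates > 10:  # boilerplate PRs at scale
        score += 1
    if p.review_comments == 0:         # output nobody ever reviews
        score += 1
    return score >= 3                  # arbitrary illustrative cutoff

bot = TrafficProfile(120.0, False, 50, 0)
human = TrafficProfile(2.5, True, 0, 4)
print(looks_like_agent(bot), looks_like_agent(human))  # True False
```

The real challenge for GitHub would be precision: legitimate CI bots and release-automation accounts trip several of these same signals, so any agent-specific quota would need a registration or attestation path rather than heuristics alone.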

Microsoft has every incentive to keep GitHub healthy and open — it’s the single biggest moat in the entire AI coding ecosystem. But “open and free” and “14 billion commits a year” are rapidly becoming incompatible when most of those commits come from unpaid agents.


The New Reality

GitHub isn’t just a code host anymore. It has become the central nervous system of AI-driven software development. Every major AI coding tool — Cursor, Claude Code, Devin, OpenDevin, Windsurf, and dozens more — routes its output straight into GitHub.

The platform that was built for millions of human developers is now being stress-tested by billions of agent actions. The growth is exhilarating. The outages are frustrating. And the business-model questions are now unavoidable.

Kyle Daigle’s “spoiler: it won’t” comment about linear growth is telling. The curve isn’t going to flatten — it’s going to keep bending upward as more capable agents ship every month.

GitHub has successfully turned itself into the rails on which the entire AI software industry runs.

Now the question is simple: how do you charge for the train when the passengers are robots?
