Open source devs consider making hogs pay for every Git pull

Summary

The Register reports that maintainers of major open source package registries (notably Maven Central) are confronting massive, concentrated demand that treats repositories like free CDNs. Sonatype’s CTO Brian Fox and other ecosystem leaders have found that a tiny fraction of IP addresses (under 1%) accounts for roughly 82% of downloads. That usage pattern, amplified by CI/CD pipelines, security scanners and AI-driven tooling, is driving unsustainable bandwidth, storage and operational costs. Registries are exploring mandatory tiered access models that keep downloads free for hobbyists while charging high-volume commercial consumers.

Key Points

  • Major registries handled an estimated 10 trillion downloads last year, creating huge infrastructure strain.
  • About 82% of traffic comes from less than 1% of IP addresses, with hyperscalers responsible for ~80% of that load.
  • Many organisations effectively use public registries as CDNs, repeatedly downloading the same artefacts at commercial scale.
  • Attempts to throttle traffic (HTTP 429 responses, brownouts) produced ‘whack-a-mole’ effects, as consumption patterns simply adapted.
  • Registries and the OpenSSF have called for mandatory tiered access models to keep services free for individuals and small projects while charging commercial-scale users.
  • Recommended mitigations: better caching, internal proxies/repositories, avoiding redundant per-commit dependency resolution, and auditing automated tooling that causes repeated pulls.
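For Maven builds, the proxy/repository advice above typically means adding a mirror entry to `settings.xml` so all Central traffic goes through an internal repository manager (such as Sonatype Nexus or JFrog Artifactory). This is a minimal sketch; the host name `internal-repo.example.com` is a placeholder for your own proxy:

```xml
<!-- ~/.m2/settings.xml: route all Maven Central requests through an
     internal caching proxy. "internal-repo.example.com" is a placeholder;
     substitute your own repository manager's URL. -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-central-proxy</id>
      <name>Internal proxy of Maven Central</name>
      <url>https://internal-repo.example.com/repository/maven-central/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```

With this in place, each artefact is fetched from Central once by the proxy and served from its local cache thereafter, which is precisely the behaviour the registries are asking commercial-scale consumers to adopt.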

Content summary

At a Linux Foundation summit Sonatype’s CTO highlighted how a handful of users and cloud providers are consuming the bulk of open source registry bandwidth. Examples include organisations downloading identical sets of components millions of times per month and misconfigured builds bypassing internal caches. The registries have therefore proposed tiered, mandatory access models so hobbyists and small teams remain free, while high-volume commercial users contribute to the running costs. The change is framed as addressing a ‘tragedy of the commons’ aggravated by modern automation and AI workflows.

The article quotes ecosystem figures and emphasises non-malicious causes — ignorance or lack of caching — and urges organisations to inspect their bills and caching strategy. It also flags that registries lack funds to implement necessary security and compliance improvements, and that regulatory demands (for example the EU Cyber Resilience Act) will further raise costs.

Context and relevance

This matters if your organisation relies on public package registries for builds, CI pipelines or AI-driven code generation. Expect policy and billing changes that could affect build times, supply-chain practices and cloud egress costs. The story links to broader trends: increasing automation in development, sharper focus on software supply-chain security, and moves by open source foundations to make stewardship sustainable.

Why should I read this?

Because if your CI/CD or AI tooling is blasting registries every few minutes, you might soon get throttled, or charged. This is a practical wake-up call: sort out your caching, stop treating public registries like an infinite CDN, and budget for registry costs before someone else does it for you. We’ve done the skimming so you don’t have to, but you should act fast.
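If you build in hosted CI, caching the local dependency store between runs is the quickest win. A minimal sketch for GitHub Actions using the standard `actions/cache` action, assuming a Maven project (the cache key and paths are illustrative):

```yaml
# Sketch: persist the local Maven repository between CI runs so each
# dependency is pulled from Central at most once per cache key, instead
# of on every commit.
- name: Cache Maven dependencies
  uses: actions/cache@v4
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
    restore-keys: |
      ${{ runner.os }}-maven-
```

Keying the cache on a hash of `pom.xml` means the cache is rebuilt only when dependencies actually change; the `restore-keys` fallback lets unchanged artefacts be reused even after a miss.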

Source

Source: https://go.theregister.com/feed/www.theregister.com/2026/02/28/open_source_opinion/