Portkey Open-Sources AI Gateway After 2T Tokens/Day

Two trillion tokens in a single day. That's Portkey's AI gateway in action, now open-sourced to free engineering teams from SaaS shackles.

[Image: Portkey AI Gateway dashboard displaying 2 trillion tokens processed daily, with metrics on cost, latency, and agent governance]

Key Takeaways

  • Portkey's gateway processed 2T tokens/day across 24K orgs, managing $180M AI spend.
  • Open-sourcing core tech democratizes AI governance while Portkey sells premium layers.
  • Agentic AI via MCP demands gateways; Portkey's poised as the NGINX of AI infra.

Two trillion tokens. Processed in one day. That’s not a typo—Portkey’s AI gateway just hit that insane scale across 24,000 organizations, managing $180 million in annualized AI spend.

And here’s the kicker: they’re open-sourcing the whole damn thing. Portkey, the control plane for production AI, launched its unified Gateway service this month, fully open-source. Think API gateway, but for AI—routing traffic, enforcing policies, monitoring agent antics—all at global scale.

Rohit Agarwal, CEO and co-founder, didn’t mince words:

“The core gateway technology should be democratized, i.e., every engineering team building AI in production needs governance and observability — and that shouldn’t require a SaaS contract. What we’ve open-sourced is the thing we think should just exist as standard reference architecture. The value we build on top of it is where we (or anyone else) run a business. But the foundation? That should be free.”

Spot on, mostly. But let’s pump the brakes: Agarwal’s pitch of open source as the antidote to SaaS sprawl overlooks the self-hosted options and pay-per-token models that already dodge subscriptions. Still, at this volume, open-sourcing feels less like charity and more like survival in a cutthroat AI infra market.

Why Open-Source a Battle-Tested AI Gateway Now?

Enterprises aren’t tinkering anymore. They’re blasting into production AI, and budgets are exploding: teams overspending, leaking PII, firing off non-compliant models. Portkey’s Gateway sits in the critical path, handling 120 million requests daily. Agarwal’s eyeing 1000x growth by 2027. That’s not hype; token volumes are skyrocketing as agentic workloads multiply.

Picture this: six months back, it was just LLM traffic. Now? Agentic workflows via their MCP Gateway (also open-sourced). Agents aren’t code snippets—they’re operational actors, querying systems, executing trades, booking meetings. No guardrails? Chaos.

“You can’t have a thousand engineers all routing through an MCP server with no way to shut it down if something goes wrong,” Agarwal said. “That’s why the MCP gateway has been the fastest-adopted thing we’ve built — enterprises don’t want to block MCP, they want a way to trust it.”

Inside the Gateway: usage policies for model rules, a live model catalog, real-time metrics on cost and latency, MCP registry for versioning servers, even OAuth support. It’s enterprise-grade, battle-hardened.
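Those features boil down to a simple admission loop: check policy, check budget, then forward. A minimal sketch in Python, where the policy fields (model allow-list, daily budget) are hypothetical illustrations, not Portkey’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class UsagePolicy:
    # Hypothetical policy fields; illustrative, not Portkey's real schema.
    allowed_models: set[str] = field(default_factory=set)
    daily_budget_usd: float = 0.0

@dataclass
class GatewayState:
    policy: UsagePolicy
    spend_today_usd: float = 0.0

def admit(state: GatewayState, model: str, est_cost_usd: float) -> bool:
    """Admit a request only if the model is allowed and budget remains."""
    if model not in state.policy.allowed_models:
        return False
    if state.spend_today_usd + est_cost_usd > state.policy.daily_budget_usd:
        return False
    state.spend_today_usd += est_cost_usd
    return True

# Allow one model with a $100/day cap.
state = GatewayState(UsagePolicy({"gpt-4o"}, daily_budget_usd=100.0))
print(admit(state, "gpt-4o", 1.50))         # True: allowed model, under budget
print(admit(state, "mystery-model", 0.10))  # False: not on the allow-list
```

The real gateway enforces this in the request path at global scale; the point of the sketch is just where the policy check sits relative to spend tracking.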

But my hot take, and this one’s fresh, not in the press release: this echoes NGINX two decades ago. NGINX shipped its core as open source in 2004, became the de facto standard for high-performance serving and load balancing, and only later monetized with NGINX Plus and commercial modules. Portkey’s pulling the same lever. Web traffic in 2004 was peanuts compared to today’s AI token flood, and by open-sourcing now, Portkey isn’t killing revenue; it’s angling to become the de facto standard, just as NGINX did. Bold prediction: the ecosystem that forms around the open core will lock in Portkey’s paid layers faster than any closed SaaS could.

Is Portkey’s Open-Source Play a SaaS Killer?

Skeptics say nah; plenty of gateways exist, from LiteLLM to Cloudflare’s AI Gateway. But none match Portkey’s scale or agent focus. They’re not just routing; they’re governing the next wave.

Agarwal frames it as freedom from “aaS fatigue”—governance-as-a-service, observability-as-a-service, all stacking up. Without open foundations, every AI function demands its own sub. Pay-as-you-go helps, sure, but at 2T tokens/day, lock-in kills.

Critique time: Portkey’s PR spins this as pure altruism, but let’s call the bluff. They’ve already got the moat—trillions processed, $180M spend visibility. Open-sourcing cements that data edge. It’s brilliant market dynamics, not philanthropy.

Agents demand this.

As MCP servers proliferate (think custom agents hitting Salesforce, GitHub, your ERP), enterprises need a latch. Portkey provides it: discover, version, auth, kill-switch. Without one, your rogue agent books a million-dollar flight; anecdotally, that already happened at a Fortune 500 last quarter.
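The kill-switch Agarwal describes is conceptually simple: one operator signal that halts every guarded tool call at once. A toy sketch, assuming nothing about Portkey’s actual mechanism:

```python
import threading

class KillSwitch:
    """Toy global kill-switch for agent tool calls; illustrative only,
    not Portkey's actual implementation."""

    def __init__(self) -> None:
        self._tripped = threading.Event()

    def trip(self) -> None:
        self._tripped.set()

    def guard(self, tool_call):
        """Wrap a tool call so it refuses to run once the switch trips."""
        def wrapped(*args, **kwargs):
            if self._tripped.is_set():
                raise RuntimeError("halted by operator kill-switch")
            return tool_call(*args, **kwargs)
        return wrapped

switch = KillSwitch()
book_flight = switch.guard(lambda dest: f"booked flight to {dest}")
print(book_flight("SFO"))  # runs normally before the switch trips
switch.trip()              # operator shuts down all guarded tools at once
# book_flight("LHR") would now raise RuntimeError
```

The value is centralization: a thousand engineers can route through one guarded choke point, and one `trip()` stops them all.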

Market parallel: Kubernetes open-sourced container orch, birthed a $10B+ ecosystem. Portkey could do the same for AI gateways. If they hit 2 quadrillion tokens by 2027? That’s when the real money flows upstream.

Why Does Portkey’s Scale Matter for Your Stack?

Data point: 24,000 orgs. That’s SMBs to giants, all funneling through Portkey. Annualized $180M spend—your AI bill’s probably in there somewhere.
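Those headline numbers imply interesting unit economics. A back-of-envelope sketch, assuming the disclosed figures hold steady over a 365-day year (a rough blended rate, nothing more):

```python
# Back-of-envelope math on the publicly stated numbers.
tokens_per_day = 2e12        # 2 trillion tokens/day
requests_per_day = 120e6     # 120 million requests/day
annual_spend_usd = 180e6     # $180M annualized AI spend

tokens_per_request = tokens_per_day / requests_per_day
cost_per_million_tokens = annual_spend_usd / (tokens_per_day * 365) * 1e6

print(f"~{tokens_per_request:,.0f} tokens per request")       # ~16,667
print(f"~${cost_per_million_tokens:.2f} per million tokens")  # ~$0.25
```

Roughly 17K tokens per request and a blended ~$0.25 per million tokens: exactly the long-context, cheap-model traffic mix you’d expect from agentic workloads.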

For devs: Deploy self-hosted, plug in observability. No vendor roulette. For platforms: Build atop it, like Portkey’s paid MCP management.
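Self-hosting typically means pointing an OpenAI-compatible client at the gateway instead of the provider. The port, path, and header name below are illustrative assumptions for the pattern, not Portkey’s documented API:

```python
import urllib.request

# Assumed local port and path for a self-hosted gateway; illustrative only.
GATEWAY_URL = "http://localhost:8787/v1/chat/completions"

def gateway_request(provider: str, api_key: str, payload: bytes) -> urllib.request.Request:
    """Build an OpenAI-style request routed through the gateway.

    The "x-provider" header is a hypothetical name for the routing hint,
    not Portkey's documented header.
    """
    return urllib.request.Request(
        GATEWAY_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
            "x-provider": provider,  # which upstream the gateway should call
        },
        method="POST",
    )

req = gateway_request("openai", "test-key", b'{"model": "gpt-4o", "messages": []}')
print(req.full_url)  # the gateway, not the provider, is the endpoint
```

Because the client only ever sees the gateway URL, swapping providers or adding policy checks never touches application code.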

Risk? Fragmentation. Expect forks galore, but a de facto standard usually emerges, and Portkey is positioned as the reference implementation.

And agents? They’re here. MCP changes everything—governing them isn’t optional.

A quick aside: remember when the microservices hype ignored observability? Cue Datadog’s billions. Same script.



Frequently Asked Questions

What is Portkey AI Gateway?

It’s an open-source control plane for production AI: routes models, enforces policies, monitors agents, tracks costs—all at trillions-of-tokens scale.

Does Portkey open-sourcing its gateway kill SaaS revenue?

Nope—they monetize layers atop the free core, mirroring NGINX’s playbook. Foundation free, value-added paid.

Is Portkey AI Gateway ready for enterprise agent workflows?

Yes—handles MCP servers with auth, versioning, kill-switches. Battle-tested on 2T tokens/day.

Written by Aisha Patel

Former ML engineer turned writer. Covers computer vision and robotics with a practitioner perspective.


Originally reported by The NewStack
