Message Bus for AI Agents: HTTP + SQLite

Scale to 20 AI agents? Forget bloated queues. A humble HTTP endpoint and SQLite file keep them chatting reliably, like email for robots.

HTTP + SQLite: The Dead-Simple Bus Powering 20+ AI Agents — theAIcatchup

Key Takeaways

  • HTTP + SQLite crushes enterprise queues for AI agent comms — simple, persistent, pull-based.
  • Lessons: Prioritize no-loss over speed; rate-limit broadcasts; fallback modes for resilience.
  • This is AI's SMTP: Dumb reliability enables agent swarms without infrastructure nightmares.

Picture this: AI agents everywhere, 20-plus of ’em, buzzing across nine servers. Everyone figured you’d need some beastly message queue—RabbitMQ thundering in, or NATS jetting payloads—to make ’em talk. Delegating tasks, syncing states, firing off queries. Chaos without infrastructure muscle, right?

Wrong. This changes everything. A whisper-quiet message bus for AI agents, built on HTTP POSTs and SQLite pulls, runs the show. No overkill. Just elegant simplicity that screams platform shift.

What Were We All Expecting from AI Swarms?

Folks dreamed of agent armies — autonomous digital workers divvying up workloads, like a sci-fi hive mind. But the plumbing? Enterprise-grade. Pub/sub fireworks. You’d hear whispers of Kubernetes orchestras tuning Kafka symphonies. Scale demands sophistication, they said.

Then bam. This setup flips it. Agents ping a central HTTP API to send notes — text blobs, really — and poll their inboxes like checking email. Intermittent heartbeats every five minutes, cron jobs kicking in. No pushy websockets freaking out over disconnects. Pull-based. Idempotent. Forgiving.

It’s the SMTP of AI agents. Remember how email conquered the world not with bells and whistles, but dogged persistence? Messages wait in queues forever if you’re offline. That’s the killer insight here — a historical parallel nobody’s shouting about yet. SMTP didn’t need threads or emojis to wire the planet. This bus doesn’t either for AI.

How Do 20+ AI Agents Actually Talk?

Agent Joe (the manager) spots a user request. POSTs to /api/send:

```shell
curl -X POST http://bus-server:8091/api/send \
  -H 'Content-Type: application/json' \
  -H 'X-Bus-Token: ' \
  -d '{"from":"joe","to":"jack","subject":"Task","body":"Handle blog publishing"}'
```

Jack polls /api/inbox/jack?mark_read=true. Grabs it. Done. Body’s a JSON nugget: from, to, subject, body, timestamp. Classic email quartet. No priorities, no tags — agents aren’t debugging Gmail.
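That quartet is easy to sketch. Here's a minimal helper that builds the message object described above — field names follow the article (from, to, subject, body, timestamp), but `makeMessage` itself is a hypothetical convenience, not part of the project's actual code:

```javascript
// Build the email-style message object the bus passes around.
// Field names come from the article; the helper is a sketch.
function makeMessage(from, to, subject, body) {
  return {
    from,                                // sender agent id
    to,                                  // receiver id, or "ALL" to broadcast
    subject,                             // short task label
    body,                                // free-form text payload
    timestamp: new Date().toISOString(), // stamped at send time
  };
}
```

An agent would POST this object as the JSON body of `/api/send`.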

Broadcast? to: ALL duplicates to every inbox. Heartbeats dump status summaries to the boss. Investment agent pings Learning agent: “Any quant trading notes?” Back comes “/shared/knowledge/quant/”. Boom. Collaboration without the cruft.
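The broadcast fan-out can be sketched in a few lines. Assumptions here: `expandRecipients` and the `agents` registry are hypothetical names, and skipping the sender's own inbox is my guess, not something the article specifies:

```javascript
// Fan out a "to: ALL" message into one independent copy per inbox.
function expandRecipients(msg, agents) {
  if (msg.to !== "ALL") return [msg];   // direct message: deliver as-is
  return agents
    .filter((id) => id !== msg.from)    // don't echo to the sender (assumption)
    .map((id) => ({ ...msg, to: id })); // duplicate, readdressed per agent
}
```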

SQLite underneath — one file, backups a breeze, queries zippy for low volume (hundreds of messages daily). HTTP over trusted LAN, token auth. Nodes spread out? Doesn’t matter. Everyone hits the bus server.

And agents? They’re flaky sleepers — online briefly, then nap. Push models flop here. Pull wins because unread messages persist. Like voicemails, not live calls.

Why Ditch RabbitMQ for This?

RabbitMQ, Redis Pub/Sub, NATS — they’re tanks for battlefields. But AI agent chatter? Tens to hundreds of messages a day. Tiny payloads. Needs inbox persistence, simple IDs. Those tools? Overkill city. Latency obsession when reliability trumps speed.

For AI agents, message loss matters more than latency.

That’s the quote that hits home. No persistence? Restart wipes queues — catastrophe for async tasks. SQLite locks it down.

Pull over push matches the vibe. No connection drama. Agents degrade gracefully if the bus hiccups — local mode only.
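That fallback is simple to express. A sketch, under assumptions: `pollInbox` and its shape are invented here, and `fetchInbox` stands in for whatever function actually hits the bus and throws when it's unreachable:

```javascript
// Graceful degradation: try the bus, fall back to local-only mode.
function pollInbox(fetchInbox, agentId) {
  try {
    return { mode: "bus", messages: fetchInbox(agentId) };
  } catch {
    // Bus is down: keep working solo, retry on the next poll cycle.
    return { mode: "local", messages: [] };
  }
}
```

Because unread messages persist server-side, nothing is lost while the agent runs in local mode — the next successful poll picks up the backlog.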

Those Gritty Production Lessons

Two months, 200 messages daily, zero loss. Stable as granite.

But hiccups taught gold. Broadcast storms — one loopy agent spamming 20 inboxes. Rate limit: 10 msgs/min per sender. Fixed.
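The 10 msgs/min rule fits in a small sliding window. The limit comes from the article; the Map-based state and window math are my sketch, not the project's code:

```javascript
// Sliding-window rate limiter: at most 10 sends per sender per minute.
const WINDOW_MS = 60_000;
const MAX_PER_WINDOW = 10;
const recentSends = new Map(); // sender id -> timestamps of recent sends

function allowSend(sender, now = Date.now()) {
  // Keep only sends still inside the one-minute window.
  const recent = (recentSends.get(sender) || []).filter((t) => now - t < WINDOW_MS);
  if (recent.length >= MAX_PER_WINDOW) {
    recentSends.set(sender, recent);
    return false; // over the limit: drop or defer the message
  }
  recent.push(now);
  recentSends.set(sender, recent);
  return true;
}
```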

Single point of failure? Bus down, comms freeze. Agents fallback to solo ops. Smart.

No message loss post-SQLite. That’s the win. Simplicity scales when needs are pure: reliable async notes.

Here’s the bold prediction: this pattern explodes. As AI agents swarm — think Devin-level coders teaming with analysts — lightweight buses like this underpin it all. Not monoliths. Not mega-queues. Email for the machine age.

Energy surges here. Imagine: your codebase births agents that email tasks across clusters. Platform shift? Absolutely. We’re not bolting AI onto apps anymore. Apps emerge from agent conversations.

But wait — corporate hype alert. Some’ll spin this as ‘revolutionary.’ Nah. It’s refreshingly dumb. That’s the power. Don’t overengineer the pipes when the flow’s light.

Why Does This Matter for AI Developers?

You’re building agent fleets? Ditch the bloat. Start here: Node.js server, SQLite, HTTP endpoints. Test with curl. Scale to 20 nodes. It’ll hum.

Key principles stick:

  • Pull for intermittency.
  • Inbox persistence for no-loss.
  • Minimalism — email format suffices.
  • Internal trust — no fancy security dance.

Wander into shared knowledge pings, task chains: User → Manager → Specialist → Back. Pure delegation delight.

This isn’t toy territory. Production-proven. Your next multi-agent rig? Bet on simple.

So yeah. AI’s platform pivot feels real when comms this frictionless. Agents talk like humans did in ‘79 — email. And it works.

Frequently Asked Questions

What is a message bus for AI agents?

It’s a lightweight system for agents to send/receive persistent messages — tasks, queries, broadcasts — without complex queues. HTTP API + storage like SQLite does it perfectly for low-volume, intermittent agents.

How to build a message bus with HTTP and SQLite?

Spin up a Node.js server with /api/send (POST) and /api/inbox/:id (GET). Store in SQLite with sender/receiver tables. Add tokens, rate limits. Agents POST to send, poll to read. Done in hours.
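The routing logic behind those two endpoints is tiny. In this sketch storage is stubbed with an in-memory Map so the shape is visible — the article's version persists to a SQLite file instead, and the handler names here are hypothetical:

```javascript
// Core of the two endpoints: append on send, drain on read.
const inboxes = new Map(); // agent id -> array of pending messages

// POST /api/send body lands here after auth and rate-limit checks.
function handleSend(msg) {
  const box = inboxes.get(msg.to) || [];
  box.push(msg);                // append keeps arrival order
  inboxes.set(msg.to, box);
  return { ok: true };
}

// GET /api/inbox/:id — ?mark_read=true clears the inbox after reading.
function handleInbox(agentId, markRead) {
  const box = inboxes.get(agentId) || [];
  if (markRead) inboxes.set(agentId, []);
  return box;
}
```

Swapping the Map for two SQLite statements (INSERT on send, SELECT + UPDATE on read) gives the persistence that makes restarts safe.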

Does this scale beyond 20 AI agents?

For hundreds of low-freq messages? Yes, easily — SQLite handles it, or swap to Postgres later. But if you’re blasting millions, revisit queues. Matches most agent setups today.

Written by Aisha Patel

Former ML engineer turned writer. Covers computer vision and robotics with a practitioner perspective.



Originally reported by dev.to
