Markdoc for LLM Streaming UI: Build Better Chatbots

Picture this: Your AI assistant streams a response, and boom — interactive charts and buttons appear mid-sentence, no buffering required. mdocUI makes it real with Markdoc's clever tags.

Markdoc Finally Cracks Open Real-Time Interactive AI UIs — theAIcatchup

Key Takeaways

  • mdocUI uses Markdoc tags to enable true streaming of markdown + interactive UI components without buffering.
  • Solves key pains like invalid streaming JSON or confused LLM JSX output with a battle-tested syntax.
  • Bold prediction: This syntax becomes the standard for generative UIs, like HTML did for the web.

Imagine you’re debugging code late at night, firing questions at your AI sidekick. It doesn’t just spit back walls of text. No. Charts bloom into existence as it types. Buttons beg for clicks. Tables sort themselves on demand. That’s the future mdocUI delivers — right now — turning static LLM chats into living, breathing interfaces.

And it’s not some distant promise. This hits real people hard: developers wasting hours hacking parsers, users staring at loading spinners while JSON blobs trickle in incomplete. mdocUI, built on Markdoc for LLM streaming UI, sweeps that mess away.

Look. Every chatbot builder knows the pain. LLMs love markdown — crisp headings, code blocks that shine. Then comes the curveball: “Show me a chart.” Suddenly, you’re knee-deep in hacks.

> "Three properties make this tag syntax ideal for LLM streaming: Unambiguous delimiter — the opening sequence is something you would never expect in normal prose, standard markdown, or fenced code blocks. A streaming parser can detect it without lookahead or backtracking."

That’s the creator’s words, straight fire. Markdoc, Stripe’s doc wizardry, steps in with tags like `[% chart type="bar" /%]`. LLMs already grok it from training data. No hand-holding prompts needed.
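The no-lookahead claim is easy to illustrate. Here is a toy incremental scanner (a sketch, not mdocUI's actual parser) that spots the opening delimiter even when it is split across two streamed chunks, carrying at most one pending character between chunks:

```typescript
// Sketch: find "[%" tag starts across streamed chunks with no lookahead
// or backtracking. Only one character ("[") ever needs to be carried over
// a chunk boundary, so earlier prose can be emitted immediately.
function findTagStarts(chunks: string[]): number[] {
  const starts: number[] = [];
  let offset = 0; // total characters consumed before the current chunk
  let prev = "";  // at most one carried character
  for (const chunk of chunks) {
    const text = prev + chunk;
    let i = 0;
    while (true) {
      const hit = text.indexOf("[%", i);
      if (hit === -1) break;
      starts.push(offset - prev.length + hit); // global position
      i = hit + 2;
    }
    prev = text.endsWith("[") ? "[" : "";
    offset += chunk.length;
  }
  return starts;
}

// The delimiter is split across chunks, yet it is found at global index 6.
console.log(findTagStarts(["Hello [", '% chart type="bar" /%] world']));
```

Compare that with JSX, where a lone `<` could be a comparison, a tag, or a hallucination; here the parser decides character by character.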

But here’s my twist — the one nobody’s shouting yet. This echoes the browser wars of the ’90s, when tables and frames turned static HTML into dynamic pages. Markdoc tags? They’re the HTML of generative UIs. Predict this: In two years, every major AI framework will fork this syntax. It’s too elegant to ignore.

Why Markdoc Eats JSON and JSX for Breakfast

JSON in code blocks? Cute until streaming mangles it. You buffer everything — kiss goodbye to that snappy feel — or chase half-baked objects. Fragile as glass.
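The failure mode takes three lines to reproduce: cut a JSON payload anywhere mid-stream and a standard parser simply throws, because a prefix of valid JSON is not valid JSON. A minimal illustration:

```typescript
// A chart spec as an LLM might emit it inside a JSON code fence.
const full = '{"type":"bar","data":[3,1,4,1,5]}';

// Mid-stream, the client has only a prefix of the payload.
const partial = full.slice(0, 20); // '{"type":"bar","data"'

function tryParse(text: string): unknown | null {
  try {
    return JSON.parse(text);
  } catch {
    return null; // an incomplete payload is simply invalid JSON
  }
}

console.log(tryParse(full) !== null);    // → true: complete payload parses
console.log(tryParse(partial) !== null); // → false: the prefix does not
```

So your options are buffering until the fence closes, or writing a forgiving partial-JSON parser. Neither is fun.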

JSX sprinkled in markdown? LLMs hallucinate props, forget slashes, litter < everywhere. Parsing turns nightmarish; one rogue angle bracket and poof, chaos.

Custom DSLs? You’re the weirdo inventing [[chart:bar:data]] syntax, force-feeding it to models, burning precious tokens on tutorials. Plus, your bespoke parser? Maintenance hell.

Markdoc sidesteps all that. Those `[% %]` delimiters? Unmistakable. A streaming parser grabs 'em token by token, no buffering drama. Prose flows uninterrupted, components slot right in.

It’s like giving LLMs Lego bricks mid-story. They snap charts after a sentence, buttons before the wrap-up. Natural. Fluid. Magical.

And the architecture? Brutally simple: Tokens feed a state-machine tokenizer (prose, tag, string modes), then streaming parser builds an AST. Zod schemas vet props — invalid? Graceful error boundaries, not app-killing explosions. Renderer swaps to your React (or soon Vue/Svelte) components, all theme-agnostic with currentColor magic.
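The prose/tag mode split can be sketched in a few lines. This is a toy, not the @mdocui/core tokenizer: string mode and attribute parsing are omitted, so a `%]` inside a quoted attribute would wrongly close the tag here.

```typescript
type Token = { kind: "prose" | "tag"; value: string };

// Toy two-mode tokenizer: accumulate prose until "[%", then accumulate
// tag text until "%]". A real tokenizer adds a string mode so quoted
// attribute values can safely contain the closing delimiter.
function tokenize(input: string): Token[] {
  const tokens: Token[] = [];
  let mode: "prose" | "tag" = "prose";
  let buf = "";
  let i = 0;
  while (i < input.length) {
    if (mode === "prose" && input.startsWith("[%", i)) {
      if (buf) tokens.push({ kind: "prose", value: buf });
      buf = "";
      mode = "tag";
      i += 2;
    } else if (mode === "tag" && input.startsWith("%]", i)) {
      tokens.push({ kind: "tag", value: buf.trim() });
      buf = "";
      mode = "prose";
      i += 2;
    } else {
      buf += input[i];
      i += 1;
    }
  }
  if (buf) tokens.push({ kind: mode, value: mode === "tag" ? buf.trim() : buf });
  return tokens;
}

console.log(tokenize('Revenue is up. [% chart type="bar" /%] Nice.'));
```

Downstream, the parser turns tag tokens into AST nodes and hands their props to schema validation before anything renders.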

How Does mdocUI Actually Work in Code?

Grab it: `pnpm add @mdocui/core @mdocui/react`. Whip up a registry, auto-gen a system prompt that teaches your LLM the tags. Boom.

In your chat hook:

```ts
const { nodes, isStreaming, push, done } = useRenderer({ registry })
```

Push tokens as they stream. Watch markdown + 24 components (charts, tables, forms, accordions, you name it) render live. Handles actions out of the box.
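The wiring is roughly: read chunks off the model stream, call `push` for each, call `done` at the end. A standalone sketch of that flow, with `push` and `done` replaced by plain collectors so it runs without React (the chunk boundaries here are invented):

```typescript
// Stand-ins for the callbacks returned by useRenderer({ registry }):
// here they just record what would be fed to the streaming parser.
const received: string[] = [];
const push = (chunk: string) => received.push(chunk);
const done = () => received.push("<done>");

// Forward chunks exactly as they arrive from the model: no buffering,
// no waiting for a complete tag before handing text to the renderer.
function forwardStream(chunks: Iterable<string>): void {
  for (const chunk of chunks) push(chunk);
  done();
}

// A token stream where the tag is split across chunk boundaries.
forwardStream(["Sales by region: ", "[% chart", ' type="bar" /%]']);
console.log(received.join("|"));
```

In a real chat hook you would pull `chunks` from your model client's streaming response instead of a literal array.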

Playground’s a revelation: https://mdocui.vercel.app. Type a prompt, see revenue bars rise mid-response, buttons pulse ready. It’s the wonder I felt first booting Windows 95 — interfaces coming alive.

Skeptical? Alpha 0.6.x, sure. API’s settling. Roadmap screams ambition: Vercel AI SDK hooks, devtools for AST peeking, VS Code syntax love. PRs welcome on GitHub.

But don’t sleep. This isn’t hype — it’s the platform shift. AI chats evolve from text dumps to app-like experiences. Your next SaaS agent? It’ll wield these tags like a pro.

Here’s the bold call: mdocUI pioneers what I’ll dub the “Streamdoc” era. Markdown’s child, UI’s parent. Companies ignoring it? They’ll rebuild chat UIs from scratch later, cursing missed shots.

Will mdocUI Replace Your Chatbot Stack?

Not overnight. But integrate it, and watch engagement soar. No more “text-only” limitations. Users interact deeper — sort tables, submit forms, drill charts — all streamed.

For solo devs, it’s plug-and-play joy. Teams? Customize the registry, own the components. Open source heartbeat ensures it grows.

Think bigger. Customer support bots with live forms. Analytics dashboards on-demand. Educational tools where concepts visualize as you learn. Real people — not just coders — win.

Frequently Asked Questions

What is mdocUI and how does it use Markdoc?

mdocUI is a generative UI library for streaming LLMs. It borrows Markdoc’s tag syntax (`[% tag /%]`) so models output markdown mixed with interactive components like charts and buttons, parsed live without buffering.

Is mdocUI ready for production use?

It’s alpha (0.6.x) but stabilizing fast. Includes 24 components, Zod validation, error boundaries. Great for prototypes; roadmap covers major frameworks soon.

Why choose Markdoc over JSON for LLM streaming UI?

Markdoc tags stream perfectly — unambiguous delimiters, no invalid partial objects. LLMs know it natively, no custom training needed. Way better than buffering JSON or parsing messy JSX.

Written by Aisha Patel

Former ML engineer turned writer. Covers computer vision and robotics with a practitioner perspective.



Originally reported by dev.to
