Will TOON Replace JSON in the AI World?
There's a new format gunning for JSON's throne. It's called TOON - Token-Oriented Object Notation. It hasn't hit mainstream yet - but it's coming.

TOON dropped in November 2025. Within weeks, it hit 21,000 stars on GitHub. The spec is at toonformat.dev. SDKs exist for TypeScript, Python, Go, Rust, and .NET.
The promise? 40% fewer tokens. Higher accuracy. Same data.
So will it actually replace JSON? Let me break down what I learned.
The Problem TOON Solves
JSON is everywhere. APIs, configs, logs, tool calls, integrations. It's the default language of structure across software.
But in LLM land, JSON has a problem: it's expensive.
Every brace, every quote, every comma, every repeated key — the tokenizer treats each one as a token or partial token. And you're paying for all of it.
Cost. Model pricing is per token. More structure = higher bills.
Latency. More tokens = slower inference.
Context pressure. Extra tokens waste context window space.
Agent loops. Structured state gets passed repeatedly. The cost multiplies with every cycle.
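To make the agent-loop economics concrete, here's a rough back-of-the-envelope model. All numbers below are illustrative assumptions for the sketch, not figures from the article's benchmarks:

```python
# Rough cost model for structured state re-sent through an agent loop.
# tokens_per_pass, iterations, and the price are illustrative assumptions.

def loop_cost(tokens_per_pass: int, iterations: int, usd_per_1k_tokens: float) -> float:
    """Total input cost when the same structured state is re-sent every cycle."""
    return tokens_per_pass * iterations * usd_per_1k_tokens / 1000

json_cost = loop_cost(tokens_per_pass=4_500, iterations=50, usd_per_1k_tokens=0.01)
toon_cost = loop_cost(tokens_per_pass=2_700, iterations=50, usd_per_1k_tokens=0.01)

print(f"JSON: ${json_cost:.2f}  TOON: ${toon_cost:.2f}  saved: ${json_cost - toon_cost:.2f}")
# JSON: $2.25  TOON: $1.35  saved: $0.90
```

The point isn't the absolute dollars; it's that the per-pass token count gets multiplied by every loop iteration, so a 40% format saving compounds across the whole run.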
What TOON Actually Looks Like
TOON's core idea: declare the schema once, then stream the values.
JSON: {"users": [{"id": 1, "name": "Alice", "role": "admin"}]} — 84 tokens.
TOON: users[1]{id,name,role}: 1,Alice,admin — 32 tokens. 62% savings on this example.
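The "schema once, values streamed" idea is simple enough to sketch by hand. This is a minimal toy encoder for TOON's tabular form, not the official SDK; the real spec at toonformat.dev additionally handles quoting, escaping, nesting, and non-uniform data:

```python
# Minimal sketch of TOON's tabular layout: the header declares the key,
# row count, and field schema once; each row then carries only values.
# Ignores the quoting/escaping rules of the full spec.

def to_toon_table(key: str, rows: list[dict]) -> str:
    fields = list(rows[0].keys())
    header = f"{key}[{len(rows)}]{{{','.join(fields)}}}:"
    body = ["  " + ",".join(str(r[f]) for f in fields) for r in rows]
    return "\n".join([header] + body)

data = [{"id": 1, "name": "Alice", "role": "admin"},
        {"id": 2, "name": "Bob", "role": "user"}]

print(to_toon_table("users", data))
# users[2]{id,name,role}:
#   1,Alice,admin
#   2,Bob,user
```

Notice what disappears relative to JSON: the braces, the quotes, and the keys repeated on every row. That's where the token savings come from.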
The Benchmarks
The TOON team ran comprehension tests across four models, using 209 data-retrieval questions.
Results: TOON hit 73.9% accuracy at 2,744 tokens. JSON hit 69.7% accuracy at 4,545 tokens. TOON used 40% fewer tokens and achieved higher accuracy.
The sweet spot is uniform arrays of objects — the kind of tabular data you'd put in a spreadsheet. Customer lists. Transaction logs. Product catalogs. Agent state objects.
So Will TOON Replace JSON?
Short answer: No, not globally. Yes, locally — in AI workflows.
JSON has massive momentum. Universal parsers. Stable RFC standards. Schema tooling. Validation tools. Replacing JSON globally would mean changing APIs, databases, event pipelines, partner integrations, and decades of tooling.
What actually happens in tech: entrenched standards rarely get replaced outright. New layers get built on top of them.
TOON won't kill JSON. It'll become a specialized format used where JSON is inefficient — mainly the prompt layer.
The Clean Architecture
Store everything as JSON. Convert JSON → TOON only at the LLM boundary. Convert back if needed.
Think of TOON as gzip for structured prompts — not the new HTTP.
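In code, the boundary pattern looks something like this. It's a self-contained sketch: `toon_encode` is a toy stand-in for a real encoder, and `build_prompt` is a hypothetical helper, not part of any SDK:

```python
import json

def toon_encode(key: str, rows: list[dict]) -> str:
    # Compact tabular encoding (toy sketch; the real SDK at toonformat.dev
    # handles quoting, escaping, and nested data).
    fields = list(rows[0])
    head = f"{key}[{len(rows)}]{{{','.join(fields)}}}:"
    body = ["  " + ",".join(str(r[f]) for f in fields) for r in rows]
    return "\n".join([head] + body)

def build_prompt(question: str, records: list[dict]) -> str:
    # JSON stays the storage and interchange format. The conversion happens
    # only here, at the moment text enters the model's context window.
    return f"{toon_encode('records', records)}\n\nQuestion: {question}"

# Everything upstream (API, database, logs) still speaks JSON.
records = json.loads('[{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]')
print(build_prompt("Who is user 2?", records))
```

Nothing outside the prompt-building step needs to know TOON exists, which is exactly why this doesn't require replacing JSON anywhere else in the stack.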
When to Use What
Use TOON when: your prompts are heavy on uniform structured data; your agent loop is expensive; you're hitting context limits; you want stable structure without JSON token overhead.
Don't use TOON when: data is deeply nested or non-uniform; interoperability is the primary objective; you need battle-tested tooling.
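The "uniform array" criterion above can even be checked programmatically before deciding which format to emit. Here's a hypothetical helper sketching that check; it's not part of any TOON SDK:

```python
def is_uniform(rows: list[dict]) -> bool:
    """True if every object shares the same flat, scalar-valued schema --
    the shape where a tabular TOON encoding pays off most."""
    if not rows:
        return False
    schema = tuple(rows[0])
    return all(
        tuple(r) == schema
        and all(not isinstance(v, (dict, list)) for v in r.values())
        for r in rows
    )

print(is_uniform([{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]))  # True
print(is_uniform([{"id": 1}, {"id": 2, "extra": {"nested": True}}]))       # False
```

A prompt builder could use a check like this to route uniform data through a tabular encoding and fall back to JSON for everything else.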
The Bigger Picture
LLMs don't just change what we build — they change how we represent information.
TOON is one of the first serious attempts at creating AI-native structured notation. It won't be the last. As AI systems scale, expect more formats designed for how models actually consume information — not how humans historically organized it.
Resources: Spec & SDK at toonformat.dev. GitHub: github.com/toon-format/toon.
Related reading
Stop Wasting Tokens. Here's the Context Strategy That Actually Scales — The broader token efficiency picture that TOON fits into.
Stop Dumping Tools Into Context. It Doesn't Scale — Context bloat from a different angle: tool definitions instead of data formats.
PageIndex vs Vector RAG — Another case of questioning the default format assumption to get better results.