When Developers Started Fighting Over Curly Braces
Someone on Reddit invented a new data format just to prove another format was terrible. That's where we are in 2025. Developers are creating entire serialization formats out of spite.
It started in October when a developer posted about TOON (Token-Oriented Object Notation). The pitch was simple. Use this instead of JSON and save 30-60% on your LLM token costs. Within weeks, the format spread across dev.to, GitHub, and LinkedIn. Everyone was sharing it. Everyone had an opinion.
Then someone ran the actual numbers and called it misleading. Another developer got so annoyed they built TRON (Token Object Notation) just to make a point. The comments section turned into chaos. One user joked about creating GOON (Good Object Oriented Notation). Another said we should just use HTML since that's what trained the models anyway.
You probably saw this happening. Maybe you tried it. Maybe you rolled your eyes and kept using JSON. Either way, something shifted. People realized their API bills were climbing and started looking at the curly braces.
What TOON Actually Does
The idea is embarrassingly simple. JSON repeats field names for every object in an array. TOON declares them once at the top.
Here's the thing. When you send 180 days of analytics data to an LLM, JSON uses 10,977 tokens. TOON uses 4,507. That's 59% fewer tokens. For a GitHub repo payload with 15K entries, you go from 15,145 tokens down to 8,745. The savings are real.
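To make the idea concrete, here's a minimal Python sketch of a TOON-like encoder for uniform arrays. The `to_toon_like` helper and its exact syntax are an approximation of the idea described above, not the official implementation — treat it as an illustration of why the savings appear.

```python
import json

def to_toon_like(name, rows):
    """Sketch of TOON's core idea: declare the field names once,
    then emit one compact row per object (approximate syntax)."""
    fields = list(rows[0].keys())
    lines = [f"{name}[{len(rows)}]{{{','.join(fields)}}}:"]
    for row in rows:
        lines.append("  " + ",".join(str(row[f]) for f in fields))
    return "\n".join(lines)

days = [
    {"date": "2025-01-01", "visits": 1204, "signups": 37},
    {"date": "2025-01-02", "visits": 1389, "signups": 41},
]

toon = to_toon_like("analytics", days)
print(toon)
print(len(toon), "vs", len(json.dumps(days)))  # the header is paid once, not per row
```

The repeated keys (`"date"`, `"visits"`, `"signups"`) appear exactly once, so the savings grow with the number of rows.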
But most comparisons cheat.
They pit TOON against pretty-printed JSON with all the whitespace and indentation. That's not fair. Minified JSON strips all that out. One developer tested this properly. Pretty-printed JSON used 101 tokens. YAML used 133. Minified JSON used 41. Suddenly YAML looks worse than both.
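Producing the fair baseline takes one line of Python: `json.dumps` with custom separators strips all of the optional whitespace.

```python
import json

payload = {"user": {"id": 42, "name": "Ada"}, "tags": ["admin", "beta"]}

# The version most comparisons use: indented, newline-heavy.
pretty = json.dumps(payload, indent=2)

# Minified: no spaces after commas or colons, no newlines.
minified = json.dumps(payload, separators=(",", ":"))

print(len(pretty), "vs", len(minified))  # minified is markedly shorter
```

Any TOON-versus-JSON benchmark that starts from the `indent=2` version is measuring whitespace, not the format.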
And TOON has the same problem. It works great for flat arrays of objects. Think database exports or API responses with consistent schemas. But real-world data is messy. Nested objects kill TOON's efficiency. The GitHub MCP tools payload is a perfect example. TOON loses to JSON when things get nested.
The Format Wars Nobody Asked For
Here's what happened next. Someone got frustrated enough to build TRON. They spent hours on it. Built a website. Wrote a JavaScript SDK. Then posted it to Reddit with a disclaimer that they probably wasted their time.
The thread exploded. People pointed out that none of these formats matter because LLMs weren't trained on them. Others said it's pointless to invent formats the models never saw during training. XML and HTML make more sense because they're everywhere in the training data.
But some developers defended it. They said if you're making hundreds of API calls with structured data, even small savings add up. When your bill is $500 a month and you can cut it to $300, you don't care about elegance.
The YAML Detour
Before TOON showed up, people were already arguing about YAML versus JSON. A Reddit post in September claimed YAML cut token usage roughly in half. It got 221 upvotes. Everyone started converting their prompts.
Then someone tested it properly and found YAML uses more tokens than minified JSON. The whole thing was based on a bad comparison. YAML needs whitespace and indentation. JSON doesn't. You can strip JSON down to nothing.
I tried this myself once. Converted a big config file to YAML thinking it would help. The model got confused. It was trained on far more JSON than YAML. Accuracy dropped. I went back to JSON and just minified it.
Why This Even Matters
The real issue isn't TOON or JSON or YAML. It's that token costs are high enough that developers care about punctuation.
Every comma in JSON counts as a token. Every brace. Every quote. When you're sending large payloads repeatedly, those add up fast. Some workflows use the same structured data in thousands of API calls. At that scale, format matters.
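A back-of-envelope calculation shows why. Using the article's 180-day analytics payload as the per-call size, with a hypothetical call volume and price (both are illustrative assumptions, not quoted rates):

```python
# Rough monthly cost of payload format at scale.
tokens_json = 10_977      # 180-day analytics payload as JSON (from the article)
tokens_toon = 4_507       # same payload as TOON (from the article)
calls_per_day = 1_000     # assumed workload
price_per_million = 3.00  # hypothetical $/1M input tokens

def monthly_cost(tokens_per_call):
    return tokens_per_call * calls_per_day * 30 * price_per_million / 1_000_000

print(round(monthly_cost(tokens_json), 2))  # 987.93
print(round(monthly_cost(tokens_toon), 2))  # 405.63
```

At one call a day the difference is pennies. At a thousand, it's a line item.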
And there's an accuracy angle too. Multiple benchmarks showed TOON getting 86.6% accuracy versus JSON's 83.2% on data retrieval queries. That's unexpected. The theory is that fewer tokens mean less noise for the model to parse. Simpler structure leads to better comprehension.
But other tests showed JSON working better for smaller models. They're trained on more JSON than anything else. Familiar formats help.
When Formats Actually Work
TOON shines with uniform data. Analytics dashboards. Database exports. API responses where every object has the same fields. Use it there and you'll see real savings.
For everything else, just minify your JSON. Remove the whitespace. Strip the indentation. You'll get most of the benefits without learning a new format.
And if your data is deeply nested or inconsistent, none of this helps. Stick with what works.
The Side Quest About Naming Things
I still think TRON is a better name than TOON. TOON sounds like a cartoon format. TRON sounds like you're about to enter a digital world and fight programs.
The best part is someone actually commented "GOON" as a joke. Good Object Oriented Notation. It's perfect. It sounds ridiculous. But so does inventing new formats every month because we're all trying to save $50 on API bills.
This reminds me of the endless naming debates in open source. Kubernetes could have been called anything. They went with a Greek word for helmsman. Docker is just a dock worker. These names stick because they're different. TOON might stick just because it's weird enough to remember.
The Honest Reality
Most people don't need this. If you're making a few API calls a day, format doesn't matter. The token difference is negligible.
But if you're building agentic workflows or processing large datasets through LLMs repeatedly, it's worth testing. Run your actual data through different formats. Measure the token count. See if accuracy changes.
Don't assume TOON is better because someone said so. Don't assume YAML saves tokens because it looks cleaner. Test it yourself with your data and your model.
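A tiny harness for that kind of test might look like this. The `count_tokens=len` default is only a character-count stand-in; for meaningful numbers, pass your model's actual tokenizer function (an assumption here — you'd plug in whatever tokenizer library your provider uses).

```python
import json

def compare_formats(data, count_tokens=len):
    """Serialize the same data several ways and score each one.
    count_tokens defaults to character count; swap in a real
    tokenizer's encode-and-count function for accurate results."""
    candidates = {
        "json-pretty": json.dumps(data, indent=2),
        "json-minified": json.dumps(data, separators=(",", ":")),
    }
    return {name: count_tokens(text) for name, text in candidates.items()}

sample = [{"id": i, "ok": True} for i in range(100)]
print(compare_formats(sample))
```

Add your own candidates (TOON, YAML, whatever you're evaluating) to the dict, then check accuracy separately with your real prompts — token count alone doesn't settle it.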
And definitely don't invent a new format out of spite unless you're ready to maintain it.
Where This Ends Up
The TOON repo on GitHub is still active. People are building implementations in Python, Rust, and Go. Some companies are using it in production for specific use cases.
But the controversy taught everyone something. Comparisons are misleading when they're not apples to apples. Minified JSON beats most alternatives. And models perform best on formats they've seen before.
The real winners are the developers who tested everything and picked what worked for their specific case. Not what was hyped. Not what saved the most tokens on a benchmark. What actually reduced their bill and kept their accuracy high.
I still use JSON for almost everything. Minified when it matters. And I think about that Reddit thread sometimes. All those hours spent building TRON. The passionate arguments in the comments. The jokes about GOON. That's just what developers do when costs get high enough. We optimize everything. Even the punctuation.