Jaison: The Fault-Tolerant JSON Parser Built for the LLM Era
If you’ve ever asked ChatGPT, Claude, Gemini, Qwen, ERNIE, or any large language model to “return JSON,” you already know the pain: the output looks perfect to human eyes but explodes the moment you feed it to JSON.parse.
A missing bracket, a trailing comma, Chinese full-width punctuation, single quotes, // comments, or a stray ```` ```json ```` fence: any one of them is enough to break your pipeline.
Jaison is a zero-dependency, pure JavaScript JSON parser designed from the ground up to fix exactly these problems in a single pass. It silently repairs dozens of structural mistakes that LLMs love to make and hands you back a usable JavaScript object.
Why the World Needed Another JSON Parser
The official JSON specification (RFC 8259) is strict by design. JSON.parse has basically zero tolerance for deviations, which is great for machine-to-machine communication but terrible when the source is a language model (or a human typing in WeChat).
Real-world examples that instantly kill JSON.parse but are trivial for humans to understand:
{"name":"张三","age":25} // Chinese punctuation
{name: "Lucy", age: 30} // unquoted keys
{'status': 'ok'} // single quotes
[1,,2,3,] // trailing and double commas
{"flag": tru, "nil": nul} // incomplete literals
Jaison fixes all of the above — and many more — automatically.
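Here is the difference in practice, as a minimal sketch using the single-argument `jaison()` call documented in the usage section below (the expected output is what the repair rules in the next section promise, not captured runtime output):

```js
const jaison = require('jaison');

// One of the broken snippets above: unquoted keys plus a trailing comma.
const broken = '{name: "Lucy", age: 30,}';

// Native parser: throws a SyntaxError and gives you nothing.
try {
  JSON.parse(broken);
} catch (err) {
  console.error('JSON.parse:', err.message);
}

// Jaison: quietly repairs the keys and the comma.
console.log(jaison(broken)); // { name: 'Lucy', age: 30 }
```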
What Jaison Can Actually Repair (with Real Examples)
| Issue | Broken Input Example | Fixed Result |
|---|---|---|
| Markdown code fences | ```` ```json\n{"answer": 42}\n``` ```` | {"answer": 42} |
| // and /* */ comments | {"name": "John", // temp\n "age": 30} | {"name": "John", "age": 30} |
| Single-quoted strings | {'user': '李四', 'city': '上海'} | {"user": "李四", "city": "上海"} |
| Chinese full-width punctuation | {“姓名”：“王五”，“年龄”：18} | {"姓名": "王五", "年龄": 18} |
| Unquoted or numeric keys | {name: "Mary", 123: "id"} | {"name": "Mary", "123": "id"} |
| Missing closing braces/brackets | {"items": [1, 2, 3 | {"items": [1, 2, 3]} |
| Unterminated strings | {"msg": "hello world | {"msg": "hello world"} |
| Trailing/double commas | [1,,2, ,3,] | [1, null, 2, null, 3] |
| Incomplete true/false/null | {"flag": tru, "ref": nul} | {"flag": true, "ref": null} |
| Alternative null tokens | {"value": none} or {"value": nil} | {"value": null} |
| Hex, octal, binary literals | {"a": 0xFF, "b": 0o70, "c": 0b1010} | {"a": 255, "b": 56, "c": 10} |
| Scientific notation | {"big": 1.23e5, "small": 4e-3} | {"big": 123000, "small": 0.004} |
| Text after valid JSON | {"answer": 42} Here is some explanation\non multiple lines | {"answer": 42} |
| Truncated streaming response | {"status":"thinking","result": | {"status":"thinking","result":null} |
| Case-insensitive constants | {"debug": TRUE, "empty": NULL} | {"debug": true, "empty": null} |
| Extremely deep nesting | {"a":{"a":{"a": … (200,000 levels) | Parsed safely (memory-only limit) |
These cover 95%+ of the malformed JSON patterns produced by mainstream LLMs worldwide — especially Chinese models that love full-width punctuation.
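To make the table concrete, here is a short sketch of the comment and truncated-stream rows, using the same single-argument call shown in the next section; the values in the comments are the repairs the table describes, not output I have verified against the package:

```js
const jaison = require('jaison');

// An inline // comment inside the payload, as in the second table row.
const commented = '{"name": "John", // temp\n "age": 30}';
console.log(jaison(commented)); // { name: 'John', age: 30 }

// A streaming response that was cut off mid-value (second-to-last row).
const truncated = '{"status":"thinking","result":';
console.log(jaison(truncated)); // { status: 'thinking', result: null }
```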
Installation & Minimal Usage
```bash
npm install jaison
```

```js
const jaison = require('jaison');

// Success → you get the object directly
const data = jaison('{"name": "Alex", "age": 28}');
console.log(data); // { name: 'Alex', age: 28 }

// When you need to handle total failures
try {
  const obj = jaison(llmOutput);
  // use obj safely
} catch (err) {
  console.error('Even Jaison gave up:', err.message);
}
```
One function, no configuration, no hidden gotchas.
When to Use Jaison vs. JSON.parse
| Scenario | Recommended Parser | Reason |
|---|---|---|
| LLM output, user input, chat messages | Jaison | Unpredictable small errors are the norm |
| Chinese users (full-width ：，) | Jaison only | The only parser that auto-converts Chinese punctuation |
| Streaming / possibly truncated responses | Jaison | Auto-completes missing brackets and values |
| Known-valid JSON > 1 MB | JSON.parse | ~6× faster on perfect data |
| Ultra-performance-critical path | JSON.parse | Jaison is slower on valid JSON (still milliseconds) |
| Insanely deep nesting (tens of thousands) | Jaison only | JSON.parse stack-overflows; Jaison is iterative |
Rule of thumb: If the source is a human or an LLM, start with Jaison.
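If you want the fast path and the safety net at the same time, one common pattern (my suggestion, not something Jaison requires) is to try JSON.parse first and fall back to Jaison only when it throws:

```js
const jaison = require('jaison');

// Fast path: JSON.parse for well-formed input, Jaison as the repair fallback.
function parseLoose(str) {
  try {
    return JSON.parse(str); // ~6× faster when the data is already valid
  } catch {
    return jaison(str);     // repairs LLM-style mistakes, throws only on hopeless input
  }
}

console.log(parseLoose('{"ok": true}'));   // valid → handled by the native parser
console.log(parseLoose("{'ok': tru, }"));  // broken → Jaison repairs it
```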
Performance Numbers (550,000+ Runs)
| Test Case | JSON.parse Time | Jaison Time | Relative Speed |
|---|---|---|---|
| Small objects | 17 ms | 80 ms | ~4.7× slower |
| 3,000-key objects | 296 ms | 1,093 ms | ~3.7× slower |
| 3,000-item arrays | 1,339 ms | 10,658 ms | ~8× slower |
| 200,000 nested levels | Stack overflow | 95 ms | JSON.parse fails completely |
| Malformed cases (comments, Chinese punctuation, etc.) | Immediate error | 20–30 ms | Jaison actually faster because it succeeds |
Bottom line: Jaison trades a little speed on perfect JSON for total reliability on the messy stuff.
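If you want to sanity-check these numbers on your own machine, a rough harness like the one below is enough; it is not the project's official benchmark, and the payloads and run counts are placeholders:

```js
const jaison = require('jaison');

// Rough timing loop: parse the same payload many times with each parser.
function time(label, fn, runs = 100000) {
  const start = Date.now();
  for (let i = 0; i < runs; i++) fn();
  console.log(`${label}: ${Date.now() - start} ms for ${runs} runs`);
}

const valid = JSON.stringify({ name: 'Alex', age: 28, tags: ['a', 'b', 'c'] });
const broken = '{"name": "Alex", // comment\n "age": 28,}';

time('JSON.parse (valid)', () => JSON.parse(valid));
time('jaison (valid)', () => jaison(valid));
time('jaison (broken)', () => jaison(broken));
```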
Frequently Asked Questions
Can Jaison extract JSON buried in the middle of normal text?
No. It expects either clean JSON or a ```` ```json ```` code block. You need to extract the block yourself first (a simple regex usually does it; see the sketch below).
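A throwaway extraction helper for that case might look like this; the regex and the extractAndParse name are illustrative, not part of Jaison's API:

````js
const jaison = require('jaison');

// Pull the first fenced block (or the first {...}/[...] span) out of free text,
// then hand it to Jaison. Illustrative only; Jaison itself does not do this step.
function extractAndParse(text) {
  const fenced = text.match(/```(?:json)?\s*([\s\S]*?)```/);
  if (fenced) return jaison(fenced[1]);
  const bare = text.match(/[{\[][\s\S]*[}\]]/);
  return bare ? jaison(bare[0]) : null;
}

const reply = 'Sure! Here is the data:\n```json\n{"answer": 42}\n```\nHope that helps.';
console.log(extractAndParse(reply)); // { answer: 42 }
````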
Does it ever change the meaning of the data?
Only in the most reasonable way. For example, 1,,2 becomes [1,null,2]. It follows the “principle of least surprise.”
Does it support BigInt?
Not as a native type (JSON itself has no BigInt). Numbers ending in n are treated as strings; convert them manually after parsing if needed.
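If you need real BigInt values, a small post-processing pass is enough. A sketch, assuming the trailing-n-as-string behavior described above:

```js
// Walk the parsed result and convert "123n"-style strings into BigInt.
// Assumes such values come back as strings, as described above.
function reviveBigInts(value) {
  if (typeof value === 'string' && /^-?\d+n$/.test(value)) {
    return BigInt(value.slice(0, -1));
  }
  if (Array.isArray(value)) return value.map(reviveBigInts);
  if (value && typeof value === 'object') {
    for (const key of Object.keys(value)) value[key] = reviveBigInts(value[key]);
  }
  return value;
}

const parsed = { id: '9007199254740993n', name: 'Alex' };
console.log(reviveBigInts(parsed)); // { id: 9007199254740993n, name: 'Alex' }
```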
Can it handle circular references?
No — JSON itself doesn’t support them, and Jaison stays faithful to that.
Is it safe to use with untrusted input?
Yes. It never executes code. Security profile is identical to JSON.parse.
Works in the browser?
Absolutely. Zero dependencies, works everywhere modern JavaScript runs.
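In a bundled front-end project the usual import forms apply; the ESM default import below relies on standard CommonJS interop, so treat it as an assumption rather than a documented entry point:

```js
// Bundled front-end code (webpack, Vite, esbuild, ...).
// Default import assumes CommonJS interop; require('jaison') is the documented form.
import jaison from 'jaison';

const data = jaison('{"answer": 42,}'); // trailing comma repaired
console.log(data.answer);               // 42
```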
How to Add Jaison to Any Project in 30 Seconds
```js
// utils/safeJson.js
const jaison = require('jaison');

function safeJsonParse(str) {
  if (typeof str !== 'string') return null;
  try {
    return jaison(str);
  } catch (e) {
    console.warn('Jaison failed too:', e.message);
    return null;
  }
}

module.exports = safeJsonParse;
```
Then everywhere you receive LLM output:
```js
const safeParse = require('./utils/safeJson');

const result = safeParse(aiResponse);
if (result !== null) {
  // safe to use
}
```
Three lines of code at the call site, and no more "Unexpected token" nightmares.
Final Thoughts
Jaison isn’t here to replace JSON.parse — it’s here to complement it. Use the native parser when you control the data and need maximum speed. Reach for Jaison the moment the source is a language model, a chat user, or anything else from the real world.
Especially if you’re building applications for Chinese users or integrating any domestic LLM, Jaison is currently the only drop-in solution that understands full-width punctuation out of the box.
One tiny package, one function, countless saved debugging hours.
npm: https://www.npmjs.com/package/jaison
GitHub: Contributions very welcome — especially more weird malformed examples from Chinese LLMs!
Happy coding, and may your AI responses always parse on the first try.

