Perhaps some macro-ridden Rust monstrosity that spits out specialised parsers at compile time, dynamically…
[1] https://github.com/omissis/go-jsonschema [2] https://github.com/pquerna/ffjson
Schemas can't fix that.
I'll agree that most tools won't fully take advantage of that information even when you provide it, but it's certainly possible to do so.
That said, JSON is designed for human readability above performance, so it's a design concession that makes sense. What doesn't make sense is using JSON anywhere performance matters.
Otherwise there is no need to keep a buffer of anything after it has been parsed.
Let's assume I send you a JSON object that is one very long string and nothing else, say 1 GB in size. To know you need to allocate a 1 GB buffer, you either have to scan the whole string first and then copy it, or keep reallocating a growing buffer until everything fits.
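A minimal sketch of that grow-and-copy cost, assuming a toy decoder (`decodeJSONString` is hypothetical and only handles a couple of escapes, unlike a real parser such as `encoding/json`). Because the decoded length isn't known up front, the output buffer keeps outgrowing its capacity, and each growth is an allocate-plus-copy:

```go
package main

import "fmt"

// decodeJSONString decodes a simplified JSON string value (the bytes after
// the opening quote, up to the closing quote). The decoded length cannot be
// known without a first scan, so the output buffer grows as we go; the
// returned count tracks how many times append had to reallocate and copy.
func decodeJSONString(in []byte) ([]byte, int) {
	out := make([]byte, 0, 8) // small initial capacity, like any generic parser
	grows := 0
	for i := 0; i < len(in); i++ {
		c := in[i]
		if c == '"' { // closing quote: done
			return out, grows
		}
		if c == '\\' && i+1 < len(in) { // handle a few simple escapes
			i++
			switch in[i] {
			case 'n':
				c = '\n'
			case 't':
				c = '\t'
			default:
				c = in[i]
			}
		}
		if len(out) == cap(out) {
			grows++ // the append below allocates a bigger buffer and copies
		}
		out = append(out, c)
	}
	return out, grows
}

func main() {
	payload := []byte(`hello\tworld, a string long enough to outgrow the buffer"`)
	s, grows := decodeJSONString(payload)
	fmt.Printf("decoded %d bytes, %d reallocations\n", len(s), grows)
}
```

With a known length (which a schema alone can't give you, only a length-prefixed wire format can), you could allocate once and copy once instead.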
It's an absurd case, but shorter strings face similar overhead.