Disallow = and : as bare items in [] containers, as such input is most
likely "[ { foo = bar } ]" mistyped as "[ foo = bar ]".
Disallow nesting errors, e.g. unterminated "[ foo bar" or mismatched
"[ foo bar }".
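As an illustration only (a minimal sketch assuming the spa_json API
from spa/utils/json.h; not part of the change itself):

    #include <stdio.h>
    #include <string.h>
    #include <spa/utils/json.h>

    int main(void)
    {
        /* likely meant "[ { foo = bar } ]" */
        const char *data = "[ foo = bar ]";
        struct spa_json it, arr;
        const char *val;
        int len;

        spa_json_init(&it, data, strlen(data));
        if (spa_json_enter_array(&it, &arr) <= 0)
            return 1;
        /* With this change the bare '=' stops the tokenizer instead of
         * being silently accepted; nesting errors such as "[ foo bar" or
         * "[ foo bar }" are caught in the same way. */
        while ((len = spa_json_next(&arr, &val)) > 0)
            printf("item: %.*s\n", len, val);
        return 0;
    }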
Fix handling of ", \ and # in bare strings.
Fix the handling of trailing comments so they are properly ignored.
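A minimal sketch of the trailing-comment case (the bare-string
escaping rules above are not restated here):

    #include <stdio.h>
    #include <string.h>
    #include <spa/utils/json.h>

    int main(void)
    {
        /* a comment at the very end of the input must be skipped */
        const char *data = "true # trailing comment";
        struct spa_json it;
        const char *val;
        int len;

        spa_json_init(&it, data, strlen(data));
        if ((len = spa_json_next(&it, &val)) > 0)
            printf("value: %.*s\n", len, val);
        /* the trailing comment must not produce another value and
         * must not trip up the parser */
        return spa_json_next(&it, &val) > 0;
    }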
Add a fixed-size stack (128 levels) to the tokenizer, so that it can
perform these checks also at nesting levels below its current depth.
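Purely to illustrate the idea, a toy sketch with invented names; this
is not the actual tokenizer code:

    #include <stdbool.h>

    #define MAX_DEPTH 128

    struct toy_tokenizer {
        char stack[MAX_DEPTH];   /* '[' or '{' per nesting level */
        int depth;
        bool error;
    };

    static void toy_open(struct toy_tokenizer *t, char c)
    {
        if (t->depth >= MAX_DEPTH) { t->error = true; return; }
        t->stack[t->depth++] = c;
    }

    static void toy_close(struct toy_tokenizer *t, char c)
    {
        char expected;
        if (t->depth == 0) { t->error = true; return; }
        expected = (t->stack[--t->depth] == '[') ? ']' : '}';
        if (c != expected)
            t->error = true;   /* e.g. "[ foo bar }" */
    }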
When the tokenizer encounters an error, make it and its parents enter
an error state in which no further input is processed. This allows the
caller to check for parse errors later, at a convenient point.
The error state can be queried with spa_json_get_error(), which also
looks up the line/column position of the error.
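Usage sketch; the struct name and fields (struct spa_error_location
with line/col members) are taken from current PipeWire headers and
should be treated as an assumption here:

    #include <stdio.h>
    #include <string.h>
    #include <spa/utils/json.h>

    int main(void)
    {
        const char *data = "{ foo = [ 1 2 }";    /* mismatched nesting */
        struct spa_json it;
        struct spa_error_location loc;
        const char *val;

        spa_json_init(&it, data, strlen(data));
        /* consume values; once the tokenizer is in the error state,
         * no further input is processed */
        while (spa_json_next(&it, &val) > 0)
            ;
        if (spa_json_get_error(&it, data, &loc))
            fprintf(stderr, "parse error at line %d, column %d\n",
                    loc.line, loc.col);
        return 0;
    }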
spa_json_parse_float/int receive a string that is not nul-terminated,
so calling string functions that assume nul-termination is invalid.
Fix this by copying the data to a buffer before parsing.
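A sketch of this approach with an invented helper name (the real code
may also need locale-independent number conversion, which is not
shown):

    #include <stdbool.h>
    #include <stdlib.h>
    #include <string.h>

    /* parse a length-delimited (not nul-terminated) number by first
     * copying it into a small nul-terminated buffer */
    static bool toy_parse_float_n(const char *val, int len, float *result)
    {
        char buf[64], *end;

        if (len <= 0 || (size_t)len >= sizeof(buf))
            return false;
        memcpy(buf, val, (size_t)len);
        buf[len] = '\0';
        *result = strtof(buf, &end);
        return end == buf + len;    /* the whole token must be consumed */
    }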
We need exactly 4 hex characters; everything else is refused. We also
copy those characters directly to the output string without assuming
any encoding.
See #2337
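This presumably concerns \u escape sequences in strings; the helper
below only illustrates the exactly-4-hex-digits check and uses
invented names:

    #include <ctype.h>
    #include <stdbool.h>

    /* accept exactly four hex digits at p, refuse anything else */
    static bool toy_parse_4hex(const char *p, const char *end, unsigned int *out)
    {
        unsigned int v = 0;
        int i;

        if (end - p < 4)
            return false;
        for (i = 0; i < 4; i++) {
            char c = p[i];
            if (!isxdigit((unsigned char)c))
                return false;
            v = (v << 4) | (unsigned int)(c <= '9' ? c - '0' :
                    (c | 0x20) - 'a' + 10);
        }
        *out = v;
        return true;
    }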
And use this in spa_json_format_float(), where we also avoid
producing invalid JSON floats.
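Usage sketch for spa_json_format_float(); exactly how non-finite
values are mapped to a valid number is an implementation detail not
restated here:

    #include <math.h>
    #include <stdio.h>
    #include <spa/utils/json.h>

    int main(void)
    {
        char buf[64];

        /* always yields a valid JSON number, even for inputs such as
         * NAN or INFINITY that have no JSON representation */
        printf("%s\n", spa_json_format_float(buf, sizeof(buf), 0.25f));
        printf("%s\n", spa_json_format_float(buf, sizeof(buf), NAN));
        return 0;
    }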
Use the JSON float format in some places where we serialize JSON
floats.
Add a unit test.
See #2223
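An illustrative shape for such a test (not the actual test-suite
code):

    #include <assert.h>
    #include <math.h>
    #include <string.h>
    #include <spa/utils/json.h>

    static void test_json_float(void)
    {
        char buf[64];
        float f = 0.0f;

        /* round-trip: the serialized form must parse back */
        spa_json_format_float(buf, sizeof(buf), 1.5f);
        assert(spa_json_parse_float(buf, (int)strlen(buf), &f) > 0);
        assert(f == 1.5f);

        /* non-finite input must still serialize to a parseable number */
        spa_json_format_float(buf, sizeof(buf), NAN);
        assert(spa_json_parse_float(buf, (int)strlen(buf), &f) > 0);
    }

    int main(void)
    {
        test_json_float();
        return 0;
    }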