--- JSON-XS/README 2008/06/03 06:43:45 1.26
+++ JSON-XS/README 2010/03/11 17:36:09 1.34
@@ -22,8 +22,8 @@
     # Note that JSON version 2.0 and above will automatically use JSON::XS
     # if available, at virtually no speed overhead either, so you should
     # be able to just:
-
-    use JSON;
+
+    use JSON;
 
     # and do the same things, except that you have a pure-perl fallback now.
@@ -46,8 +46,6 @@
     cases their maintainers are unresponsive, gone missing, or not listening
     to bug reports for other reasons.
 
-    See COMPARISON, below, for a comparison to some other JSON modules.
-
     See MAPPING, below, on how JSON::XS maps perl values to JSON values and
     vice versa.
@@ -382,6 +380,8 @@
     This setting has no effect when decoding JSON texts.
 
+    This setting currently has no effect on tied hashes.
+
     $json = $json->allow_nonref ([$enable])
     $enabled = $json->get_allow_nonref
         If $enable is true (or missing), then the "encode" method can
@@ -631,11 +631,19 @@
     stream incrementally. It does so by accumulating text until it has a full
     JSON object, which it then can decode. This process is similar to using
     "decode_prefix" to see if a full JSON object is available, but is
-    much more efficient (JSON::XS will only attempt to parse the JSON text
-    once it is sure it has enough text to get a decisive result, using a
-    very simple but truly incremental parser).
+    much more efficient (and can be implemented with a minimum of method
+    calls).
+
+    JSON::XS will only attempt to parse the JSON text once it is sure it has
+    enough text to get a decisive result, using a very simple but truly
+    incremental parser. This means that it sometimes won't stop as early as
+    the full parser; for example, it doesn't detect mismatched parentheses.
+    The only thing it guarantees is that it starts decoding as soon as a
+    syntactically valid JSON text has been seen. This means you need to set
+    resource limits (e.g. "max_size") to ensure the parser will stop parsing
+    in the presence of syntax errors.
 
-    The following two methods deal with this.
+    The following methods implement this incremental parser.
 
     [void, scalar or list context] = $json->incr_parse ([$string])
         This is the central parsing function. It can both append new text
@@ -680,16 +688,19 @@
     $json->incr_skip
         This will reset the state of the incremental parser and will remove
-        the parsed text from the input buffer. This is useful after
+        the parsed text from the input buffer so far. This is useful after
         "incr_parse" died, in which case the input buffer and incremental
         parser state is left unchanged, to skip the text parsed so far and
         to reset the parse state.
 
+        The difference from "incr_reset" is that only the text up to the
+        parse error is removed.
+
     $json->incr_reset
         This completely resets the incremental parser, that is, after this
        call, it will be as if the parser had never parsed anything.
 
-        This is useful if you want ot repeatedly parse JSON objects and want
+        This is useful if you want to repeatedly parse JSON objects and want
         to ignore any trailing data, which means you have to reset the
         parser after each successful decode.
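    Putting these methods together, a rough sketch of a stream consumer
    could look as follows ($fh stands in for whatever handle the JSON
    stream arrives on, and the process() call is a placeholder for your
    own code; "max_size" provides the resource limit mentioned above):

       use JSON::XS;

       my $json = JSON::XS->new->max_size (1 << 20); # refuse overlong texts

       while (defined (my $chunk = <$fh>)) {
          # in list context, incr_parse appends $chunk to the buffer and
          # returns all complete JSON texts accumulated so far
          my @objects = eval { $json->incr_parse ($chunk) };

          if ($@) {
             # a syntax error: discard the text parsed so far, keep the rest
             $json->incr_skip;
             next;
          }

          process ($_) for @objects;
       }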
@@ -1065,6 +1076,69 @@
     in mail), and works because ASCII is a proper subset of most 8-bit and
     multibyte encodings in use in the world.
 
+  JSON and ECMAscript
+    JSON syntax is based on how literals are represented in javascript (the
+    not-standardised predecessor of ECMAscript) which is presumably why it
+    is called "JavaScript Object Notation".
+
+    However, JSON is not a subset (and also not a superset of course) of
+    ECMAscript (the standard) or javascript (whatever browsers actually
+    implement).
+
+    If you want to use javascript's "eval" function to "parse" JSON, you
+    might run into parse errors for valid JSON texts, or the resulting data
+    structure might not be queryable:
+
+    One of the problems is that U+2028 and U+2029 are valid characters
+    inside JSON strings, but are not allowed in ECMAscript string literals,
+    so the following Perl fragment will not output something that can be
+    guaranteed to be parsable by javascript's "eval":
+
+       use JSON::XS;
+
+       print encode_json [chr 0x2028];
+
+    The right fix for this is to use a proper JSON parser in your javascript
+    programs, and not rely on "eval" (see for example Douglas Crockford's
+    json2.js parser).
+
+    If this is not an option, you can, as a stop-gap measure, simply encode
+    to ASCII-only JSON:
+
+       use JSON::XS;
+
+       print JSON::XS->new->ascii->encode ([chr 0x2028]);
+
+    Note that this will enlarge the resulting JSON text quite a bit if you
+    have many non-ASCII characters. You might be tempted to run some regexes
+    to only escape U+2028 and U+2029, e.g.:
+
+       # DO NOT USE THIS!
+       my $json = JSON::XS->new->utf8->encode ([chr 0x2028]);
+       $json =~ s/\xe2\x80\xa8/\\u2028/g; # escape U+2028
+       $json =~ s/\xe2\x80\xa9/\\u2029/g; # escape U+2029
+       print $json;
+
+    Note that *this is a bad idea*: the above only works for U+2028 and
+    U+2029 and thus only for fully ECMAscript-compliant parsers. Many
+    existing javascript implementations, however, have issues with other
+    characters as well - using "eval" naively simply *will* cause problems.
+
+    Another problem is that some javascript implementations reserve some
+    property names for their own purposes (which probably makes them
+    non-ECMAscript-compliant). For example, Iceweasel reserves the
+    "__proto__" property name for its own purposes.
+
+    If that is a problem, you could try to filter the resulting JSON output
+    for these property strings, e.g.:
+
+       $json =~ s/"__proto__"\s*:/"__proto__renamed":/g;
+
+    This works because "__proto__" is not valid outside of strings, so every
+    occurrence of ""__proto__"\s*:" must be a string used as a property name.
+
+    If you know of other incompatibilities, please let me know.
+
   JSON and YAML
     You often hear that JSON is a subset of YAML. This is, however, a mass
     hysteria(*) and very far from the truth (as of the time of this
@@ -1081,10 +1155,10 @@
     This will *usually* generate JSON texts that also parse as valid YAML.
     Please note that YAML has hardcoded limits on (simple) object key
     lengths that JSON doesn't have and also has different and incompatible
-    unicode handling, so you should make sure that your hash keys are
-    noticeably shorter than the 1024 "stream characters" YAML allows and
-    that you do not have characters with codepoint values outside the
-    Unicode BMP (basic multilingual page). YAML also does not allow "\/"
+    unicode character escape syntax, so you should make sure that your hash
+    keys are noticeably shorter than the 1024 "stream characters" YAML
+    allows and that you do not have characters with codepoint values outside
+    the Unicode BMP (basic multilingual page). YAML also does not allow "\/"
     sequences in strings (which JSON::XS does not *currently* generate, but
     other JSON generators might).
@@ -1111,6 +1185,12 @@
     spreading lies about the real compatibility for many *years* and
     trying to silence people who point out that it isn't true.
+
+    Addendum/2009: the YAML 1.2 spec is still incompatible with JSON, even
+    though the incompatibilities have been documented (and are known to
+    Brian) for many years and the spec makes explicit claims that YAML is a
+    superset of JSON. It would be so easy to fix, but apparently, bullying
+    and corrupting user data is so much easier.
+
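    Coming back to the key length and BMP limits mentioned above: if you
    want a quick sanity check on the data you are about to encode for a
    YAML-parsing peer, something along the following lines will do
    ($hashref stands for whatever structure you are about to encode; this
    is only a rough sketch, JSON::XS itself does not perform such checks):

       for my $key (keys %$hashref) {
          warn "key too long for YAML: $key\n"
             if length $key > 1000;                # stay well below 1024
          warn "key contains non-BMP characters: $key\n"
             if $key =~ /[^\x{0000}-\x{ffff}]/;    # outside the Unicode BMP
       }

    The same codepoint restriction applies to string values, not just to
    hash keys.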
 
 SPEED
     It seems that JSON::XS is surprisingly fast, as shown in the following
     tables. They have been generated with the help of the "eg/bench" program
@@ -1123,49 +1203,48 @@
 
        {"method": "handleMessage", "params": ["user1",
        "we were just talking"], "id": null, "array":[1,11,234,-5,1e5,1e7,
-       true, false]}
+       1, 0]}
 
     It shows the number of encodes/decodes per second (JSON::XS uses the
     functional interface, while JSON::XS/2 uses the OO interface with
-    pretty-printing and hashkey sorting enabled, JSON::XS/3 enables shrink).
-    Higher is better:
-
-   module     |     encode |     decode |
-   -----------|------------|------------|
-   JSON 1.x   |   4990.842 |   4088.813 |
-   JSON::DWIW |  51653.990 |  71575.154 |
-   JSON::PC   |  65948.176 |  74631.744 |
-   JSON::PP   |   8931.652 |   3817.168 |
-   JSON::Syck |  24877.248 |  27776.848 |
-   JSON::XS   | 388361.481 | 227951.304 |
-   JSON::XS/2 | 227951.304 | 218453.333 |
-   JSON::XS/3 | 338250.323 | 218453.333 |
-   Storable   |  16500.016 | 135300.129 |
-   -----------+------------+------------+
-
-    That is, JSON::XS is about five times faster than JSON::DWIW on
-    encoding, about three times faster on decoding, and over forty times
-    faster than JSON, even with pretty-printing and key sorting. It also
+    pretty-printing and hashkey sorting enabled, JSON::XS/3 enables shrink.
+    JSON::DWIW/DS uses the deserialise function, while JSON::DWIW/FJ uses
+    the from_json method). Higher is better:
+
+   module        |     encode |     decode |
+   --------------|------------|------------|
+   JSON::DWIW/DS |  86302.551 | 102300.098 |
+   JSON::DWIW/FJ |  86302.551 |  75983.768 |
+   JSON::PP      |  15827.562 |   6638.658 |
+   JSON::Syck    |  63358.066 |  47662.545 |
+   JSON::XS      | 511500.488 | 511500.488 |
+   JSON::XS/2    | 291271.111 | 388361.481 |
+   JSON::XS/3    | 361577.931 | 361577.931 |
+   Storable      |  66788.280 | 265462.278 |
+   --------------+------------+------------+
+
+    That is, JSON::XS is almost six times faster than JSON::DWIW on
+    encoding, about five times faster on decoding, and over thirty to
+    seventy times faster than JSON's pure perl implementation. It also
     compares favourably to Storable for small amounts of data.
 
     Using a longer test string (roughly 18KB, generated from Yahoo! Locals
     search API ().
 
-   module     |     encode |     decode |
-   -----------|------------|------------|
-   JSON 1.x   |     55.260 |     34.971 |
-   JSON::DWIW |    825.228 |   1082.513 |
-   JSON::PC   |   3571.444 |   2394.829 |
-   JSON::PP   |    210.987 |     32.574 |
-   JSON::Syck |    552.551 |    787.544 |
-   JSON::XS   |   5780.463 |   4854.519 |
-   JSON::XS/2 |   3869.998 |   4798.975 |
-   JSON::XS/3 |   5862.880 |   4798.975 |
-   Storable   |   4445.002 |   5235.027 |
-   -----------+------------+------------+
+   module        |     encode |     decode |
+   --------------|------------|------------|
+   JSON::DWIW/DS |   1647.927 |   2673.916 |
+   JSON::DWIW/FJ |   1630.249 |   2596.128 |
+   JSON::PP      |    400.640 |     62.311 |
+   JSON::Syck    |   1481.040 |   1524.869 |
+   JSON::XS      |  20661.596 |   9541.183 |
+   JSON::XS/2    |  10683.403 |   9416.938 |
+   JSON::XS/3    |  20661.596 |   9400.054 |
+   Storable      |  19765.806 |  10000.725 |
+   --------------+------------+------------+
 
     Again, JSON::XS leads by far (except for Storable which non-surprisingly
-    decodes faster).
+    decodes a bit faster).
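    If you only want a rough idea of the relative performance on your own
    hardware, a comparison along the following lines can be run with the
    core Benchmark module. This is just a sketch built around the short
    test record shown above - the tables in this section were produced
    with "eg/bench", not with this snippet:

       use Benchmark qw(cmpthese);
       use JSON::XS;
       use Storable qw(nfreeze thaw);

       my $data = {
          method => "handleMessage",
          params => ["user1", "we were just talking"],
          id     => undef,
          array  => [1, 11, 234, -5, 1e5, 1e7, 1, 0],
       };

       my $json   = encode_json $data;
       my $frozen = nfreeze $data;

       cmpthese -1, {
          json_encode     => sub { encode_json $data },
          json_decode     => sub { decode_json $json },
          storable_freeze => sub { nfreeze $data },
          storable_thaw   => sub { thaw $frozen },
       };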
    On large strings containing lots of high Unicode characters, some
    modules (such as JSON::PC) seem to decode faster than JSON::XS, but the
@@ -1210,11 +1289,11 @@
 
     If you are using JSON::XS to return packets to consumption by JavaScript
     scripts in a browser you should have a look at
-    to see whether
-    you are vulnerable to some common attack vectors (which really are
-    browser design bugs, but it is still you who will have to deal with it,
-    as major browser developers care only for features, not about getting
-    security right).
+
+    to see whether you are vulnerable to some common attack vectors (which
+    really are browser design bugs, but it is still you who will have to
+    deal with it, as major browser developers care only for features, not
+    about getting security right).
 
 THREADS
     This module is *not* guaranteed to be thread safe and there are no plans