--- JSON-XS/README	2008/03/27 06:37:35	1.24
+++ JSON-XS/README	2009/02/19 01:13:46	1.29
@@ -34,7 +34,7 @@
     Beginning with version 2.0 of the JSON module, when both JSON and
     JSON::XS are installed, then JSON will fall back on JSON::XS (this can
-    be overriden) with no overhead due to emulation (by inheritign
+    be overridden) with no overhead due to emulation (by inheriting
     constructor and methods). If JSON::XS is not available, it will fall
     back to the compatible JSON::PP module as backend, so using JSON
     instead of JSON::XS gives you a portable JSON API that can be fast when
     you need
@@ -46,8 +46,6 @@
     cases their maintainers are unresponsive, gone missing, or not listening
     to bug reports for other reasons.
 
-    See COMPARISON, below, for a comparison to some other JSON modules.
-
     See MAPPING, below, on how JSON::XS maps perl values to JSON values and
     vice versa.
 
@@ -59,7 +57,7 @@
     * round-trip integrity
 
-      When you serialise a perl data structure using only datatypes
+      When you serialise a perl data structure using only data types
      supported by JSON, the deserialised data structure is identical on
      the Perl level. (e.g. the string "2.0" doesn't suddenly become "2"
      just because it looks like a number). There *are* minor exceptions
@@ -80,12 +78,12 @@
     * simple to use
 
      This module has both a simple functional interface as well as an
-      objetc oriented interface interface.
+      object oriented interface.
 
     * reasonably versatile output formats
 
      You can choose between the most compact guaranteed-single-line
-      format possible (nice for simple line-based protocols), a pure-ascii
+      format possible (nice for simple line-based protocols), a pure-ASCII
      format (for when your transport is not 8-bit clean, still supports
      the whole Unicode range), or a pretty-printed format (for when you
      want to read that stuff). Or you can combine those features in
@@ -103,7 +101,7 @@
           $json_text = JSON::XS->new->utf8->encode ($perl_scalar)
 
-       except being faster.
+       Except being faster.
 
     $perl_scalar = decode_json $json_text
        The opposite of "encode_json": expects an UTF-8 (binary) string and
@@ -114,7 +112,7 @@
           $perl_scalar = JSON::XS->new->utf8->decode ($json_text)
 
-       except being faster.
+       Except being faster.
 
     $is_boolean = JSON::XS::is_bool $scalar
        Returns true if the passed scalar represents either JSON::XS::true
@@ -154,7 +152,7 @@
        doesn't exist.
 
     4. A "Unicode String" is simply a string where each character can be
-       validly interpreted as a Unicode codepoint.
+       validly interpreted as a Unicode code point.
 
        If you have UTF-8 encoded data, it is no longer a Unicode string,
        but a Unicode string encoded in UTF-8, giving you a binary string.
@@ -400,6 +398,21 @@
           JSON::XS->new->allow_nonref->encode ("Hello, World!")
           => "Hello, World!"
 
+    $json = $json->allow_unknown ([$enable])
+    $enabled = $json->get_allow_unknown
+       If $enable is true (or missing), then "encode" will *not* throw an
+       exception when it encounters values it cannot represent in JSON (for
+       example, filehandles) but instead will encode a JSON "null" value.
+       Note that blessed objects are not included here and are handled
+       separately by "allow_blessed".
+
+       If $enable is false (the default), then "encode" will throw an
+       exception when it encounters anything it cannot encode as JSON.
+
+       This option does not affect "decode" in any way, and it is
+       recommended to leave it off unless you know your communications
+       partner.
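
As a rough illustration of the "allow_unknown" switch documented above, here is
a minimal sketch (not part of the README diff itself; it simply assumes the
semantics described, using a filehandle as an example of an unrepresentable
value):

    use JSON::XS;

    # with allow_unknown enabled, a value JSON cannot represent (here a
    # filehandle/glob reference) is encoded as null instead of croaking
    my $coder = JSON::XS->new->allow_unknown;
    print $coder->encode ([ \*STDIN ]), "\n";    # typically prints: [null]

    # with the default (allow_unknown disabled) the same call throws an
    # exception:
    # JSON::XS->new->encode ([ \*STDIN ]);       # croaks
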
+
     $json = $json->allow_blessed ([$enable])
     $enabled = $json->get_allow_blessed
        If $enable is true (or missing), then the "encode" method will not
@@ -543,9 +556,9 @@
     $json = $json->max_depth ([$maximum_nesting_depth])
     $max_depth = $json->get_max_depth
        Sets the maximum nesting level (default 512) accepted while encoding
-       or decoding. If the JSON text or Perl data structure has an equal or
-       higher nesting level then this limit, then the encoder and decoder
-       will stop and croak at that point.
+       or decoding. If a higher nesting level is detected in JSON text or a
+       Perl data structure, then the encoder and decoder will stop and
+       croak at that point.
 
        Nesting level is defined by number of hash- or arrayrefs that the
        encoder needs to traverse to reach a given point or the number of
@@ -555,9 +568,12 @@
        Setting the maximum depth to one disallows any nesting, so that
        ensures that the object is only a single hash/object or array.
 
-       The argument to "max_depth" will be rounded up to the next highest
-       power of two. If no argument is given, the highest possible setting
-       will be used, which is rarely useful.
+       If no argument is given, the highest possible setting will be used,
+       which is rarely useful.
+
+       Note that nesting is implemented by recursion in C. The default
+       value has been chosen to be as large as typical operating systems
+       allow without crashing.
 
        See SECURITY CONSIDERATIONS, below, for more info on why this is
        useful.
@@ -566,14 +582,12 @@
     $max_size = $json->get_max_size
        Set the maximum length a JSON text may have (in bytes) where decoding
        is being attempted. The default is 0, meaning no limit.
-       When "decode" is called on a string longer then this number of
-       characters it will not attempt to decode the string but throw an
+       When "decode" is called on a string that is longer than this many
+       bytes, it will not attempt to decode the string but throw an
        exception. This setting has no effect on "encode" (yet).
 
-       The argument to "max_size" will be rounded up to the next highest
-       power of two (so may be more than requested). If no argument is
-       given, the limit check will be deactivated (same as when 0 is
-       specified).
+       If no argument is given, the limit check will be deactivated (same
+       as when 0 is specified).
 
        See SECURITY CONSIDERATIONS, below, for more info on why this is
        useful.
@@ -609,19 +623,25 @@
          => ([], 3)
 
INCREMENTAL PARSING
-    [This section and the API it details is still EXPERIMENTAL]
-
    In some cases, there is the need for incremental parsing of JSON texts.
    While this module always has to keep both JSON text and resulting Perl
    data structure in memory at one time, it does allow you to parse a JSON
    stream incrementally. It does so by accumulating text until it has a
    full JSON object, which it then can decode. This process is similar to
    using "decode_prefix" to see if a full JSON object is available, but is
-    much more efficient (JSON::XS will only attempt to parse the JSON text
-    once it is sure it has enough text to get a decisive result, using a
-    very simple but truly incremental parser).
+    much more efficient (and can be implemented with a minimum of method
+    calls).
 
-    The following two methods deal with this.
+    JSON::XS will only attempt to parse the JSON text once it is sure it has
+    enough text to get a decisive result, using a very simple but truly
+    incremental parser. This means that it sometimes won't stop as early as
+    the full parser; for example, it doesn't detect parenthesis mismatches.
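
To make the incremental interface more concrete, here is a minimal sketch (an
illustrative example, not part of the README diff) that feeds a JSON text to
the parser in two pieces using the "incr_parse" method described below:

    use JSON::XS;

    my $coder = JSON::XS->new;

    # append a first, incomplete chunk; nothing can be decoded yet
    $coder->incr_parse ('[1, 2, ');

    # append the rest; in list context incr_parse returns every complete
    # JSON value accumulated so far
    for my $value ($coder->incr_parse ('3]')) {
       printf "decoded an array of %d elements\n", scalar @$value;
    }

    # if a chunk turns out to be garbage, incr_parse dies; incr_skip then
    # discards the offending text and resets the parse state
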
+
+    The only thing it guarantees is that it starts decoding as soon as a
+    syntactically valid JSON text has been seen. This means you need to set
+    resource limits (e.g. "max_size") to ensure the parser will stop parsing
+    in the presence of syntax errors.
+
+    The following methods implement this incremental parser.
 
    [void, scalar or list context] = $json->incr_parse ([$string])
        This is the central parsing function. It can both append new text
@@ -666,11 +686,22 @@
    $json->incr_skip
        This will reset the state of the incremental parser and will remove
-       the parsed text from the input buffer. This is useful after
+       the text parsed so far from the input buffer. This is useful after
        "incr_parse" died, in which case the input buffer and incremental
        parser state is left unchanged, to skip the text parsed so far and
        to reset the parse state.
 
+       The difference from "incr_reset" is that only text up to the parse
+       error is removed.
+
+    $json->incr_reset
+       This completely resets the incremental parser, that is, after this
+       call, it will be as if the parser had never parsed anything.
+
+       This is useful if you want to repeatedly parse JSON objects and want
+       to ignore any trailing data, which means you have to reset the
+       parser after each successful decode.
+
 LIMITATIONS
    All options that affect decoding are supported, except "allow_nonref".
    The reason for this is that it cannot be made to work sensibly: JSON
@@ -899,7 +930,7 @@
        can also use "JSON::XS::false" and "JSON::XS::true" to improve
        readability.
 
-           encode_json [\0,JSON::XS::true]      # yields [false,true]
+           encode_json [\0, JSON::XS::true]     # yields [false,true]
 
    JSON::XS::true, JSON::XS::false
        These special values become JSON true and JSON false values,
@@ -1043,6 +1074,69 @@
    in mail), and works because ASCII is a proper subset of most 8-bit and
    multibyte encodings in use in the world.
 
+  JSON and ECMAscript
+    JSON syntax is based on how literals are represented in javascript (the
+    not-standardised predecessor of ECMAscript), which is presumably why it
+    is called "JavaScript Object Notation".
+
+    However, JSON is not a subset (and also not a superset, of course) of
+    ECMAscript (the standard) or javascript (whatever browsers actually
+    implement).
+
+    If you want to use javascript's "eval" function to "parse" JSON, you
+    might run into parse errors for valid JSON texts, or the resulting data
+    structure might not be queryable:
+
+    One of the problems is that U+2028 and U+2029 are valid characters
+    inside JSON strings, but are not allowed in ECMAscript string literals,
+    so the following Perl fragment will not output something that can be
+    guaranteed to be parsable by javascript's "eval":
+
+       use JSON::XS;
+
+       print encode_json [chr 0x2028];
+
+    The right fix for this is to use a proper JSON parser in your javascript
+    programs, and not rely on "eval" (see for example Douglas Crockford's
+    json2.js parser).
+
+    If this is not an option, you can, as a stop-gap measure, simply encode
+    to ASCII-only JSON:
+
+       use JSON::XS;
+
+       print JSON::XS->new->ascii->encode ([chr 0x2028]);
+
+    Note that this will enlarge the resulting JSON text quite a bit if you
+    have many non-ASCII characters. You might be tempted to run some regexes
+    to only escape U+2028 and U+2029, e.g.:
+
+       # DO NOT USE THIS!
+       my $json = JSON::XS->new->utf8->encode ([chr 0x2028]);
+       $json =~ s/\xe2\x80\xa8/\\u2028/g; # escape U+2028
+       $json =~ s/\xe2\x80\xa9/\\u2029/g; # escape U+2029
+       print $json;
+
+    Note that *this is a bad idea*: the above only works for U+2028 and
+    U+2029 and thus only for fully ECMAscript-compliant parsers. Many
+    existing javascript implementations, however, have issues with other
+    characters as well - using "eval" naively simply *will* cause problems.
+
+    Another problem is that some javascript implementations reserve some
+    property names for their own purposes (which probably makes them
+    non-ECMAscript-compliant). For example, Iceweasel reserves the
+    "__proto__" property name for its own purposes.
+
+    If that is a problem, you could try to filter the resulting JSON
+    output for these property strings, e.g.:
+
+       $json =~ s/"__proto__"\s*:/"__proto__renamed":/g;
+
+    This works because "__proto__" is not valid outside of strings, so every
+    occurrence of ""__proto__"\s*:" must be a string used as a property name.
+
+    If you know of other incompatibilities, please let me know.
+
  JSON and YAML
    You often hear that JSON is a subset of YAML. This is, however, a mass
    hysteria(*) and very far from the truth (as of the time of this
@@ -1099,8 +1193,9 @@
    single-line JSON string (also available at ).
 
-       {"method": "handleMessage", "params": ["user1", "we were just talking"], \
-       "id": null, "array":[1,11,234,-5,1e5,1e7, true, false]}
+       {"method": "handleMessage", "params": ["user1",
+       "we were just talking"], "id": null, "array":[1,11,234,-5,1e5,1e7,
+       true, false]}
 
    It shows the number of encodes/decodes per second (JSON::XS uses the
    functional interface, while JSON::XS/2 uses the OO interface with
@@ -1203,9 +1298,8 @@
BUGS
    While the goal of this module is to be correct, that unfortunately does
-    not mean it's bug-free, only that I think its design is bug-free. It is
-    still relatively early in its development. If you keep reporting bugs
-    they will be fixed swiftly, though.
+    not mean it's bug-free, only that I think its design is bug-free. If you
+    keep reporting bugs they will be fixed swiftly, though.
 
    Please refrain from using rt.cpan.org or any other bug reporting
    service. I put the contact address into my modules for a reason.