std.data.json formal review

Marco Leise via Digitalmars-d digitalmars-d at puremagic.com
Sun Sep 27 10:43:36 PDT 2015


Am Mon, 03 Aug 2015 12:11:14 +0300
schrieb Dmitry Olshansky <dmitry.olsh at gmail.com>:

> [...]
>
> Now back to our land let's look at say rapidJSON.
> 
> It MAY seem to handle big integers:
> https://github.com/miloyip/rapidjson/blob/master/include/rapidjson/internal/biginteger.h
> 
> But it's used only to parse doubles:
> https://github.com/miloyip/rapidjson/pull/137
> 
> Anyhow the API says it all - only integers up to 64bit and doubles:
> 
> http://rapidjson.org/md_doc_sax.html#Handler
> 
> Pretty much what I expect by default.
> And plz-plz don't hardcode BigInt in the JSON parser, it's slow plus it 
> causes epic code bloat as Don already pointed out.

I would take RapidJSON with a grain of salt; its main goal is
to be the fastest JSON parser. Nothing wrong with that, but
BigInt and fast don't naturally go together, and the C standard
library also doesn't come with a BigInt type that could
conveniently be plugged in.
Please compare instead with JSON parsers in languages that
provide BigInts, e.g. Ruby:
http://ruby-doc.org/stdlib-1.9.3/libdoc/json/rdoc/JSON/Ext/Generator/GeneratorMethods/Bignum.html
Optional is fine, but no support at all would be so '90s.

My impression is that the standard wants to allow JSON to be
used in environments that cannot provide BigInt support, but a
modern language for PCs with a BigInt module should totally
support reading long integers and be able to do proper
rounding of double values. I thought about reading two
BigInts, one for the significand and one for the
base-10 exponent, so you don't need a BigFloat but still keep
the full accuracy of the textual string as x*10^y.
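A minimal sketch of that two-BigInt decomposition, written in Python (whose built-in int is arbitrary-precision) rather than D, with a hypothetical helper name and a simplified number grammar; it is not the std.data.json API:

```python
import re

# Hypothetical helper, not part of any JSON library.
# Decomposes a JSON number literal into (significand, exponent)
# such that the exact value is significand * 10**exponent.
# Simplified grammar: unlike strict JSON it tolerates leading zeros.
_JSON_NUMBER = re.compile(r'^(-?)(\d+)(?:\.(\d+))?(?:[eE]([+-]?\d+))?$')

def decompose(text):
    m = _JSON_NUMBER.match(text)
    if not m:
        raise ValueError("not a JSON number: " + text)
    sign, int_part, frac_part, exp_part = m.groups()
    frac_part = frac_part or ""
    # Shift the decimal point out of the significand...
    significand = int(sign + int_part + frac_part)
    # ...and compensate by lowering the base-10 exponent.
    exponent = (int(exp_part) if exp_part else 0) - len(frac_part)
    return significand, exponent
```

For example, decompose("123456789012345678901234567890.5e3") yields the pair (1234567890123456789012345678905, 2), losing nothing from the textual representation; a consumer can then round to double once, with full precision available, instead of accumulating error digit by digit.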

-- 
Marco
