Please help with writing spec for async JSON APIs
bjouhier at gmail.com
Mon Aug 3 09:38:56 UTC 2015
The SAX approach is not ideal for JSON because we don't want the overhead
of a callback on every key (especially if parsing and callbacks are handled
by different threads).
To be efficient, we need a hybrid approach: an evented API (SAX-like)
for top-level keys, and direct mapping to JS values for deeper keys. In the
feed case, you only need one event for the header, one for every entry, and
one for the trailer. In the i-json API I handle this with a `maxDepth`
option in the parser constructor.
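To make the idea concrete, here is a minimal sketch of such a hybrid parser. This is illustrative only: the `FeedParser` name and the chunk-based `write` method are assumptions for the sketch, not the actual i-json API. It treats the feed as one big JSON array, emits one event per top-level entry, and materializes everything below that depth directly with `JSON.parse`:

```javascript
// Sketch: hybrid evented/direct-mapping JSON feed parser (maxDepth = 1).
// One callback per top-level array element; anything nested deeper is
// materialized as a plain JS value, so there is no per-key callback cost.
class FeedParser {
  // onEntry receives each fully materialized top-level array element.
  constructor(onEntry) {
    this.onEntry = onEntry;
    this.depth = 0;        // nesting depth; the feed array itself is depth 1
    this.buf = '';         // raw text of the entry being accumulated
    this.inString = false; // inside a JSON string literal
    this.escaped = false;  // previous char was a backslash inside a string
  }
  _flush() {
    const text = this.buf.trim();
    if (text) this.onEntry(JSON.parse(text)); // direct mapping below maxDepth
    this.buf = '';
  }
  // Accepts arbitrary chunks; entry boundaries may span chunks.
  write(chunk) {
    for (const ch of chunk) {
      if (this.inString) {
        this.buf += ch;
        if (this.escaped) this.escaped = false;
        else if (ch === '\\') this.escaped = true;
        else if (ch === '"') this.inString = false;
      } else if (ch === '"') {
        this.inString = true;
        this.buf += ch;
      } else if (ch === '[' || ch === '{') {
        if (this.depth > 0) this.buf += ch;  // part of an entry
        this.depth++;
      } else if (ch === ']' || ch === '}') {
        this.depth--;
        if (this.depth > 0) this.buf += ch;
        else this._flush();                  // end of the feed array
      } else if (ch === ',' && this.depth === 1) {
        this._flush();                       // entry boundary: one event
      } else if (this.depth > 0) {
        this.buf += ch;
      }
    }
  }
}
```

A real implementation would do the tokenizing in native code and only cross the JS boundary once per entry, which is exactly where the hybrid approach wins over pure SAX.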
2015-08-03 3:25 GMT+02:00 Brendan Eich <brendan at mozilla.org>:
> Exactly! Incremental and async, i.e., streaming.
> XML quickly needed such APIs (https://en.wikipedia.org/wiki/StAX).
> JSON's in the same boat.
> Bruno Jouhier wrote:
>> A common use case is large JSON feeds: header + lots of entries + trailer
>> When processing such feeds, you should not bring the whole JSON in memory
>> all at once. Instead you should process the feed incrementally.
>> So, IMO, an alternate API should not be just asynchronous, it should also
>> be incremental.
>> FWIW, I have implemented an incremental/evented parser for V8 with a
>> simple API. This parser is incremental but not async (because V8 imposes
>> that materialization happen in the main JS thread). But, if the V8
>> restriction could be lifted, it could be made async with the same API.
>> i-json's API is a simple, low-level API. A more sophisticated solution
>> would be a duplex stream.
>> There was also a long discussion on this topic on node's GitHub.
>> es-discuss mailing list
>> es-discuss at mozilla.org