Please help with writing a spec for async JSON APIs
Allen Wirfs-Brock
allen at wirfs-brock.com
Mon Aug 3 17:29:33 UTC 2015
On Aug 3, 2015, at 9:02 AM, James M Snell wrote:
> On Mon, Aug 3, 2015 at 8:34 AM, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:
> [snip]
>>
>> 4) JSON.parse/stringify are pure computational operations. There is no
>> perf benefit to making them asynchronous unless some of their computation
>> can be performed concurrently.
>>
>
> If we're speaking strictly about making the JSON parsing asynchronous,
> then correct, there is really no performance benefit to speak of. You
> may be able to offload the parsing to a separate thread, but it's
> going to take the same amount of time. The real benefit will come when
> (a) JSON parsing becomes incremental
Yes, incremental is good. But do you really mean just "parsing" rather than "processing"?
> and (b) a developer is given
> greater control over exactly how the JSON is converted to/from
> strings.
Strictly speaking, JSON is strings. JSON.stringify/parse convert JS values (including objects) to/from such strings.
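For example:

  JSON.stringify({a: 1})    // produces the string '{"a":1}'
  JSON.parse('{"a":1}')     // produces a JS object: {a: 1}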
>
> Something along the lines of...
>
> JSON.parser(input).
>   on('key', function(key, context) {
>     if (key === 'foo')
>       console.log(context.value());
>     else if (key === 'bar')
>       context.on('key', ...);
>   }).
>   on('end', function() {
>   });
>
I have to guess at your semantics, but what you are trying to express above seems like something that can already be accomplished using the `reviver` argument to JSON.parse.
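For example, something along these lines (a rough sketch; 'foo' and 'bar' are just the placeholder keys from your example, and the logging only marks where you would hook in):

  var input = '{"foo": 1, "bar": {"baz": 2}}';  // sample input string

  JSON.parse(input, function (key, value) {
    // The reviver is invoked once for every key/value pair, bottom-up,
    // as the parsed object graph is built.
    if (key === 'foo') {
      console.log(value);
    } else if (key === 'bar') {
      // `value` is the already-materialized object for "bar",
      // so its nested keys can be inspected here as well.
      console.log(Object.keys(value));
    }
    return value;  // returning the value keeps it in the result
  });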
> In other words: allowing for incremental access to the stream and
> fine-grained control over the parsing process, rather than having to block
> while everything is parsed out, building up the in-memory object
> model, then being forced to walk that model in order to do anything
> interesting.
>
> Personally, I'm not overly concerned about the possibility of races.
But TC39 is concerned about races.
>
> - James
>