ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Roma Bronstein outsidenote at gmail.com
Fri Jun 21 20:32:34 UTC 2019


Hi,

It's my first time suggesting a feature, hope I'm doing it correctly.

I really like using Array.prototype.map(), Array.prototype.reduce() and all
related functions.
The code is more readable and it looks better.
However, when I want to write performance-sensitive code, chaining these
functions is not a good approach.
For example, writing this:
// a is an Array of length N
const b = a.filter(pred).map(fn)

will require 2 traversals over the array: up to 2*N iterations in total (N
for filter, plus up to N more for map if the predicate passes all items).
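To make the cost concrete, here is a runnable version of the chain above
(the predicate and mapper are example stand-ins):

```javascript
const a = [1, 2, 3, 4]; // N = 4
// filter() allocates an intermediate array, which map() then traverses again.
const b = a.filter(x => x % 2 === 0).map(x => x * 10);
console.log(b); // [20, 40]
```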

This is why I often resort to writing this:
const b = []
a.forEach((item) => {
  if (/*the filter condition*/)
    b.push(/*mapping logic*/)
})

Which requires only N iterations.
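The difference in iteration counts can be checked directly by counting
callback invocations (the callbacks here are illustrative):

```javascript
const a = Array.from({ length: 1000 }, (_, i) => i); // N = 1000

// Chained filter().map(): the predicate runs N times, and the mapper runs
// once more per surviving item.
let chained = 0;
a.filter(x => { chained++; return x % 2 === 0; })
 .map(x => { chained++; return x * 2; });
// chained is 1500 here: 1000 filter calls + 500 map calls

// Single forEach pass: exactly N callback invocations.
let single = 0;
const b = [];
a.forEach(x => {
  single++;
  if (x % 2 === 0) b.push(x * 2);
});
// single is 1000 here
```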

I suggest adding a capability to stream items through these functions.
My inspiration is Redis's transaction syntax, where you declare the start
of a transaction and finally call EXEC in order to execute it.
So now I'll be able to write something like this:
const b = a.stream()
  .filter()
  .map()
  .exec()

Just to clarify the example:
I've declared that I'd like to stream the items of array a. Then I've
chained the functions I'd like the items to pass through.
Finally, I've triggered execution with the exec() function.
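A userland sketch of these semantics is possible today with generators
(Stream, stream() and exec() below are hypothetical names from this
suggestion, not existing APIs):

```javascript
// Userland sketch only: Stream, stream() and exec() are hypothetical names.
class Stream {
  constructor(iterable) { this.iterable = iterable; }
  filter(pred) {
    const src = this.iterable;
    return new Stream((function* () {
      for (const x of src) if (pred(x)) yield x;
    })());
  }
  map(fn) {
    const src = this.iterable;
    return new Stream((function* () {
      for (const x of src) yield fn(x);
    })());
  }
  // exec() drives the lazy pipeline once, so each item passes through
  // every stage in a single traversal, with no intermediate arrays.
  exec() { return [...this.iterable]; }
}
const stream = (arr) => new Stream(arr);

const b = stream([1, 2, 3, 4])
  .filter(x => x % 2 === 0)
  .map(x => x * 10)
  .exec();
console.log(b); // [20, 40]
```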

I'm not sure this is the best syntactic approach, but in my opinion this
example is more intuitive to understand.

Another approach could be a "pipeline" operator, as in UNIX shells,
providing a more generic capability to pipeline iterators.
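As a rough illustration of that direction, a generic pipeline over
iterators could look like this (pipe and the stage helpers are
hypothetical names, not proposed APIs):

```javascript
// Hypothetical generic iterator pipeline.
const pipe = (iterable, ...stages) =>
  stages.reduce((it, stage) => stage(it), iterable);

const filter = (pred) => function* (it) {
  for (const x of it) if (pred(x)) yield x;
};
const map = (fn) => function* (it) {
  for (const x of it) yield fn(x);
};

// Items flow through all stages lazily, in a single traversal.
const result = [...pipe(
  [1, 2, 3, 4],
  filter(x => x % 2 === 0),
  map(x => x * 10)
)];
console.log(result); // [20, 40]
```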

Again, I hope I'm doing this correctly and in the right forum.
And if so, I'd be happy to hear some feedback.

Thanks,
Roma