For an associative operation f over the elements of an array a, the following relation should hold: a.reduce(f) should be equivalent to a.reduceRight(f).
Indeed, it does hold true for operations that are both associative and commutative. For example:
const a = [0,1,2,3,4,5,6,7,8,9];
const add = (a, b) => a + b;
console.log(a.reduce(add));      // 45
console.log(a.reduceRight(add)); // 45
However, it doesn't hold true for operations that are associative but not commutative. For example:
const a = [[0,1],[2,3],[4,5],[6,7],[8,9]];
const concat = (a, b) => a.concat(b);
console.log(JSON.stringify(a.reduce(concat)));      // [0,1,2,3,4,5,6,7,8,9]
console.log(JSON.stringify(a.reduceRight(concat))); // [8,9,6,7,4,5,2,3,0,1]
We need to flip the arguments of f for reduceRight to make them equivalent:
const a = [[0,1],[2,3],[4,5],[6,7],[8,9]];
const concat = (a, b) => a.concat(b);
const concatRight = (b, a) => a.concat(b);
console.log(JSON.stringify(a.reduce(concat)));           // [0,1,2,3,4,5,6,7,8,9]
console.log(JSON.stringify(a.reduceRight(concatRight))); // [0,1,2,3,4,5,6,7,8,9]
This makes me believe that the native implementation of reduceRight is wrong. I believe that the reduceRight function should be implemented as follows:
var REDUCE_ERROR = "Reduce of empty array with no initial value";
Array.prototype.reduceRight = function (f, acc) {
let { length } = this;
const noAcc = arguments.length < 2;
if (noAcc && length === 0) throw new TypeError(REDUCE_ERROR);
let result = noAcc ? this[--length] : acc;
while (length > 0) result = f(this[--length], result, length, this); // current value first, accumulator second
return result;
};
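To check the claim, here is the same flipped-argument logic written as a standalone helper (a sketch for illustration; in real code you would not override Array.prototype, and the name reduceRightFlipped is mine, not from any library). With the current value passed first, the relation from the top of the question holds even for the non-commutative concat:

```javascript
const REDUCE_ERROR = "Reduce of empty array with no initial value";

// Hypothetical helper: reduceRight with (currentValue, accumulator) order.
function reduceRightFlipped(arr, f, acc) {
  let { length } = arr;
  const noAcc = arguments.length < 3;
  if (noAcc && length === 0) throw new TypeError(REDUCE_ERROR);
  let result = noAcc ? arr[--length] : acc;
  // Walk backwards; current element goes first, accumulator second.
  while (length > 0) result = f(arr[--length], result, length, arr);
  return result;
}

const a = [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]];
const concat = (x, y) => x.concat(y);

console.log(JSON.stringify(a.reduce(concat)));              // [0,1,2,3,4,5,6,7,8,9]
console.log(JSON.stringify(reduceRightFlipped(a, concat))); // [0,1,2,3,4,5,6,7,8,9]
```

Note that the same callback works unchanged in both directions, which is exactly the equivalence the question is after.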
Since result represents the previous (right-hand side) value, it makes sense to make it the second parameter of the function f. The current value represents the left-hand side value, so it makes sense to make it the first parameter of f. This way, the aforementioned relation holds true even for non-commutative associative operations.
So, my questions are:
- Doesn't it indeed make more sense for reduceRight to be implemented the way I did?
- Why is the native reduceRight not implemented the way I did?
Maybe. However, the JavaScript array iterators do not come from a pure functional programming background.
Because it's simpler (easier to remember) to have the same parameter order, with the accumulator always first.
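A rough sketch of that design (my own illustration, not the spec algorithm; sparse arrays and other edge cases are ignored): reduceRight walks the indices backwards but keeps the same callback signature as reduce, with the accumulator first:

```javascript
// Sketch of the native behaviour: only the iteration order differs from reduce.
function myReduceRight(arr, f, acc) {
  let i = arr.length - 1;
  let result = arguments.length < 3 ? arr[i--] : acc;
  for (; i >= 0; i--) result = f(result, arr[i], i, arr); // accumulator first
  return result;
}

const a = [[0, 1], [2, 3], [4, 5]];
const concat = (x, y) => x.concat(y);
console.log(JSON.stringify(myReduceRight(a, concat))); // [4,5,2,3,0,1]
```

The upside is that one callback-signature rule covers both functions; the cost is the asymmetry the question observes for non-commutative operations.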
The primitive operation on arrays is reduce, which iterates from 0 to n-1 as always. Only in Haskell, with its recursively built lists, does foldr make more sense (it has the build duality, works well lazily on infinite lists…). Notice how the naming is not reduce + reduceLeft…

Then reduceRight does not invert the fold operation, it just reverses the iteration order. This is also how it is typically explained in documentation and tutorials, for example in The Definitive Guide:

Also the first implementation of reduce/reduceRight (see Bug 363040) in Mozilla's array extras for JS 1.8 followed this approach: it just flipped start with end and negated the step value.

The notes of Dave Herman for the ES4 spec followed this line of thought. It does mention Haskell, but the whole document does not deal with the parameter order of the callback at all. Maybe the distinct order was lost in Haskell's uncommon syntax, or in the canonical type names, so that both signatures began with (a -> b -> …. More discussion went into the missing thisObject parameter. Some relevant excerpts:
And finally, that is what got into the EcmaScript spec: