Several ways to combine JS arrays, and their advantages and disadvantages

  • 2020-03-30 03:57:23
  • OfStack

This article covers a basic JavaScript skill: the common ways to combine/merge two JS arrays, and the trade-offs of each approach.

Let's look at the specific scenario first:


var q = [ 5, 5, 1, 9, 9, 6, 4, 5, 8];
var b = [ "tie", "mao", "csdn", "ren", "fu", "fei" ];

Obviously, simply joining arrays q and b end to end should give:


[
    5, 5, 1, 9, 9, 6, 4, 5, 8,
    "tie", "mao", "csdn", "ren", "fu", "fei"
]

The concat(..) method

The most common usage is as follows:


var c = q.concat( b );

q; // [5,5,1,9,9,6,4,5,8]
b; // ["tie","mao","csdn","ren","fu","fei"]

c; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

As you can see, c is a whole new array representing the combination of the two arrays q and b, while q and b themselves are left untouched. Simple, right?

But what if q has 10,000 elements and b has another 10,000? Array c now has 20,000 elements, roughly doubling the memory usage.

"That's no problem! You might think, just leave q and b blank, and it will be garbage collected, right? Problem solved!


q = b = null; //'q' and 'b' can now be garbage collected

Hmm? If the arrays are small, that's fine. But for large arrays, or when this process is repeated many times in a memory-limited environment, it leaves plenty of room for optimization.

Looped insertion

OK, let's try adding the contents of one array to the other, using the Array#push() method:


// insert the contents of array `b` into `q`
for (var i = 0; i < b.length; i++) {
    q.push( b[i] );
}

q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
b = null;

Now, q holds the contents of the two original arrays (q + b).

Looks like a good memory optimization.

But what if the q array is small and the b array is large? For the sake of both memory and speed, we'd rather insert the smaller q at the front of b.


// insert `q` at the front of `b` (walk backwards so `q` keeps its original order):
for (var i = q.length - 1; i >= 0; i--) {
    b.unshift( q[i] );
}

b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
q = null;

Practical skills

Unfortunately, the for loop is ugly and harder to maintain. Can we do better?
Let's try Array#reduce first:


// append `b` onto `q`:
q = b.reduce( function(coll, item){
    coll.push( item );
    return coll;
}, q );

q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

// or prepend `q` onto `b`:
b = q.reduceRight( function(coll, item){
    coll.unshift( item );
    return coll;
}, b );

b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

Array#reduce() and Array#reduceRight() are elegant, but a bit clunky and hard to remember. ES6 arrow functions shrink the amount of code, but each array element still requires a function call, which can be wasteful.
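
For example, the reduce(..) combination above could be written with ES6 arrow functions roughly like this (a minimal sketch, assuming an ES6-capable engine):


// append `b` onto `q` with an arrow function:
q = b.reduce( (coll, item) => (coll.push( item ), coll), q );

// or prepend `q` onto `b`:
b = q.reduceRight( (coll, item) => (coll.unshift( item ), coll), b );
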
What about the following code?


// append `b` onto `q`:
q.push.apply( q, b );

q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

// or prepend `q` onto `b`:
b.unshift.apply( b, q );

b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

Much nicer, isn't it? In particular, the unshift() approach no longer needs to worry about the reversed ordering as before. ES6's spread operator is nicer still: q.push( ...b ) or b.unshift( ...q ).
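
Spelled out, those ES6 spread forms would look something like this (again assuming an ES6-capable engine):


// append `b` onto `q` with the spread operator:
q.push( ...b );
q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

// or prepend `q` onto `b`:
b.unshift( ...q );
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
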

However, there are problems in both cases. Whether you pass q or b as the second argument to apply() (the first argument becomes this inside the called function), or expand it with the ... spread operator, the array is broken up into individual arguments to the function.
The first major problem is that it doubles the memory usage (temporarily, of course!), because the array's contents are essentially copied onto the call stack for the function call. In addition, different JS engines have implementation-dependent limits on the number of arguments that can be passed to a function.

If you add a million elements to an array this way, you're bound to blow past the call stack / argument limit, whether it's a push(..) or an unshift(..) call.
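
As an illustration, here is a sketch of what hitting that limit can look like; the exact threshold and error message are engine-dependent, so treat this as an assumption rather than guaranteed behavior:


var big = new Array( 1000000 ).fill( 0 );   // one million elements
var target = [];

try {
    target.push.apply( target, big );       // tries to spread a million arguments
} catch (err) {
    // many engines throw a RangeError here, e.g. "Maximum call stack size exceeded"
    console.log( err.name );
}
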

Note: you can also try splice(..); you'll find it has the same limitation as push(..) / unshift(..).
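
For reference, a splice(..)-based combination might look something like the sketch below; it spreads the array into arguments in exactly the same way, so it shares the same limit:


// prepend `q` onto `b` using splice(..):
// the arguments are: start index 0, delete 0 items, then every element of `q`
b.splice.apply( b, [ 0, 0 ].concat( q ) );

b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
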

One option is to continue using this method, but in batches:


function combineInto(q, b) {
    // process 5000 elements at a time, working from the end of `q` toward the
    // start, so that unshift() keeps `q` in its original order in front of `b`
    for (var i = q.length; i > 0; i = i - 5000) {
        b.unshift.apply( b, q.slice( Math.max( 0, i - 5000 ), i ) );
    }
}
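
A quick usage sketch with the arrays from this article:


var q = [ 5, 5, 1, 9, 9, 6, 4, 5, 8 ];
var b = [ "tie", "mao", "csdn", "ren", "fu", "fei" ];

combineInto( q, b );
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
q = null;
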

Wait, now we're hurting the readability of the code (and possibly even the performance!). Let's end the journey here, before we give up entirely.

Conclusion

Array#concat() is a time-tested method for combining two (or more) arrays, but it creates a new array instead of modifying an existing one.
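
For completeness, concat(..) accepts any number of arrays, so combining several at once is a one-liner (a sketch; `more` stands in for a hypothetical third array):


var all = q.concat( b, more );  // brand new array; `q`, `b`, and `more` are left untouched
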

There are many alternatives, but they all have different strengths and weaknesses and need to be chosen according to the actual situation.

Of the approaches listed above (and others not listed), perhaps the best overall are reduce(..) and reduceRight(..).

Whatever you choose, think critically about your array merging strategy rather than taking it for granted.

