HiveBrain v1.2.0

Codewars: Sum of Pairs

Submitted by: @import:stackexchange-codereview

Tags: pairs, codewars, sum

Problem

I've been working on this task (see screenshot).

Here's my answer to it so far:

var sum_pairs = function(ints, s) {
  //your code here

  var ptr_1 = 0,
      ptr_2 = 0,
      i = 0,
      j = 0,
      len = ints.length,
      j_min = len,
      arr_sum = [];

  for (i; i < len - 1; i++) {
    for (j = i + 1; j < len; j++) {
      if (ints[i] + ints[j] === s) {
        if (j < j_min) {
          j_min = j;
          arr_sum = [parseInt(ints[i]), parseInt(ints[j])];
        }
      }
    }
  }

  return (arr_sum.length === 0) ? undefined : arr_sum;
};


However, while the tests all pass, the run prints a message on STDERR asking me to optimise the code.

How might I be able to optimise my code further?

Solution

A straightforward enumeration with two nested loops will always take a long time on extremely large arrays whenever the first match is formed by elements near the start and end of the array (e.g. indices 515 and 9,123,456 in a 10-million-element array). In that case the inner loop cannot exit early, and the code performs the same excessive number of iterations as the original.
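To see this at a smaller scale, here is an illustrative sketch (the helper `bruteForcePairs` and the array sizes are mine, not from the question): it runs the same nested-loop strategy on an adversarial array where the only matching pair sits at the two ends, so every one of the n·(n−1)/2 inner iterations executes.

```javascript
function bruteForcePairs(ints, s) {
    // Same nested-loop strategy as the original code, plus an iteration counter.
    var iterations = 0;
    var jMin = ints.length;
    var best; // stays undefined when no pair sums to s
    for (var i = 0; i < ints.length - 1; i++) {
        for (var j = i + 1; j < ints.length; j++) {
            iterations++;
            if (ints[i] + ints[j] === s && j < jMin) {
                jMin = j;
                best = [ints[i], ints[j]];
            }
        }
    }
    return { pair: best, iterations: iterations };
}

// Adversarial array: only 5 + 6 = 11, and they sit at the two ends.
var n = 2000;
var adversarial = new Array(n).fill(0);
adversarial[0] = 5;
adversarial[n - 1] = 6;

var result = bruteForcePairs(adversarial, 11);
console.log(result.pair, result.iterations); // [ 5, 6 ] 1999000
```

At n = 2000 this is already ~2 million iterations; at 10 million elements it would be ~5 × 10¹³.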

Here's updated code that performs much faster on average. The idea is to build a map from each unique value to its minimal index in the source array, while simultaneously checking whether an entry for the complementary number (the other addend) already exists.

var sum_pairs = function(ints, s) {
    var map = {},
        pair, pairMaxIndex = ints.length - 1;

    for (var i = 0; i <= pairMaxIndex; i++) {
        var a = ints[i];
        var b = s - a;
        var j = map[b];
        if (j !== undefined && i <= pairMaxIndex && j <= pairMaxIndex) {
            pairMaxIndex = i > j ? i : j;
            pair = i < j ? [a, b] : [b, a];
        }
        var tmp = map[a];
        if (tmp === undefined || i < tmp) {
            map[a] = i;
        }
    }
    return pair;
};



  • 10 million randoms in the -100..100 range: 1 ms - 1000 ms

  • 10 million randoms in the -1,000,000..1,000,000 range: 1 ms - 5000 ms

  • 10 million randoms in the -10,000,000..10,000,000 range: 1 ms - crash

  • 1 million randoms in the -1,000,000,000..1,000,000,000 range: 1 ms - 2000 ms

The worst time in each case was produced by an artificially crafted array of the kind described above.

The crash in the 3rd test occurred because the number of unique values neared 1G and the map object exceeded the 32-bit JavaScript engine's memory limit of 2 GB. All tests were run on an i7 CPU.

The code above incorporates the following optimizations as well:

  • Flambino's answer shows there's no need to continue enumeration once the current index exceeds the maximum index of the best matching pair found so far.

  • Jodrell's answer shows it's possible to do everything in just one loop, as pointed out by Jonah.

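Stripped of the early-exit bound and the minimal-index bookkeeping, the one-loop idea reduces to a short sketch (my simplified illustration, not code from either answer): record each value as it is seen, and return as soon as the current element's complement has already appeared. Returning on the first hit automatically yields the pair whose second element occurs earliest, which is what the kata asks for.

```javascript
// Minimal single-pass sketch: O(n) time, O(n) extra space.
var sum_pairs_simple = function(ints, s) {
    var seen = {};
    for (var i = 0; i < ints.length; i++) {
        var a = ints[i];
        if (seen[s - a]) return [s - a, a]; // complement appeared earlier
        seen[a] = true;
    }
    return undefined; // no pair sums to s
};

console.log(sum_pairs_simple([10, 5, 2, 3, 7, 5], 10)); // [ 3, 7 ]
```

The full version above keeps indices in the map (not just booleans) so it can also shrink `pairMaxIndex` and stop scanning early.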
And the tests:

console.clear();
var arr = [], max = 1000000;
for (var i = 0; i<10000000; i++) arr[i] = Math.random() * max * 2 - max |0;
console.log('array of', arr.length, 'elements is built');

console.time(1);
console.log(sum_pairs(arr, 1000));
console.timeEnd(1);
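A note on the generator line: `Math.random() * max * 2 - max | 0` works because bitwise OR binds more loosely than multiplication and subtraction, so `| 0` truncates the whole arithmetic expression toward zero. A small check of that behaviour (the sample values are my own):

```javascript
// `| 0` applies to the entire expression and truncates toward zero,
// turning the float into a 32-bit integer.
var max = 1000000;
var hi = 0.75 * max * 2 - max | 0;  // 1500000 - 1000000 = 500000
var lo = 0.25 * max * 2 - max | 0;  // 500000 - 1000000 = -500000
console.log(hi, lo); // 500000 -500000
console.log(1.5 | 0, -1.5 | 0); // 1 -1 (truncation, not flooring)
```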

Code Snippets

var sum_pairs = function(ints, s) {
    var map = {},
        pair, pairMaxIndex = ints.length - 1;

    for (var i = 0; i <= pairMaxIndex; i++) {
        var a = ints[i];
        var b = s - a;
        var j = map[b];
        if (j !== undefined && i <= pairMaxIndex && j <= pairMaxIndex) {
            pairMaxIndex = i > j ? i : j;
            pair = i < j ? [a, b] : [b, a];
        }
        var tmp = map[a];
        if (tmp === undefined || i < tmp) {
            map[a] = i;
        }
    }
    return pair;
};
console.clear();
var arr = [], max = 1000000;
for (var i = 0; i<10000000; i++) arr[i] = Math.random() * max * 2 - max |0;
console.log('array of', arr.length, 'elements is built');

console.time(1);
console.log(sum_pairs(arr, 1000));
console.timeEnd(1);

Context

StackExchange Code Review Q#138213, answer score: 3

Revisions (0)

No revisions yet.