I have the following deterministic noise function which I've been using in a C# and C++ terrain generator for a while:

float GridNoise(int x, int z, int seed)
{
    int n = (1619*x + 31337*z + 1013*seed) & 0x7fffffff;
    n = (n >> 13) ^ n;

    return 1 - ((n*(n*n*60493 + 19990303) + 1376312589) & 0x7fffffff)/(float)1073741824;
}

It returns a 'random' float between -1 and 1 for any integer x/z coordinates I enter (plus there's a seed so I can generate different terrains). I tried implementing the same function in JavaScript, but the results aren't as expected. For small values it seems OK, but as I use larger values (on the order of ~10000) the results are less and less random, and eventually all it returns is 1.

You can see it working correctly in C# here, and the incorrect JS results for the same input here.

I suspect it's something to do with JS variables not being strict integers, but can anyone shed more light? Does anyone have a similarly simple deterministic function I could use in JS if this doesn't work?

4 Answers

0
obscure On

I'm afraid your code is exceeding the largest integer JavaScript can represent exactly (Number.MAX_SAFE_INTEGER, i.e. 2^53 - 1). As soon as that happens, it returns 1, because the calculation ((n*(n*n*60493 + 19990303) + 1376312589) & 0x7fffffff)/1073741824 will always be 0 - thus 1 - 0 = 1
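A minimal sketch of that collapse, assuming a direct line-for-line port of the question's function (`gridNoiseNaive` is a hypothetical name):

```javascript
// Direct port of the C# function. JavaScript Numbers are 64-bit floats:
// once the product n*(n*n*60493 + 19990303) needs more than ~84 bits,
// its entire low 32-bit window is lost to rounding, so the
// & 0x7fffffff mask sees only zeros and the function returns exactly 1.
function gridNoiseNaive(x, z, seed) {
  let n = (1619 * x + 31337 * z + 1013 * seed) & 0x7fffffff;
  n = (n >> 13) ^ n;
  return 1 - ((n * (n * n * 60493 + 19990303) + 1376312589) & 0x7fffffff) / 1073741824;
}

console.log(gridNoiseNaive(0, 0, 10000)); // prints 1: the noise has collapsed
```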

1
Jaromanda X On

The underlying problem is, in javascript, there are no integers - all arithmetic is done on Number (a 64-bit float with 53 bits of integer precision)

In c#, if you're using ints or longs, any overflowing bits are just discarded (the arithmetic wraps around)

In javascript, you need to handle this yourself

There's a numeric format coming to browsers that will help, but it's not everywhere yet - BigInt ... it's in Chrome/Opera and behind a flag in Firefox (desktop, not Android)

(no word on Edge (dead anyway) or Safari (the new IE) - and of course, IE will never get them)

The best I can come up with using BigInt is

function gridNoise(x, z, seed) {
    var n = (1619 * x + 31337 * z + 1013 * seed) & 0x7fffffff;
    // promote to BigInt before the overflowing multiply
    n = BigInt((n >> 13) ^ n);
    // exact arbitrary-precision arithmetic - no bits lost
    n = n * (n * n * 60493n + 19990303n) + 1376312589n;
    // keep the low 31 bits (same as & 0x7fffffff)
    n = parseInt(n.toString(2).slice(-31), 2);
    return 1 - n / 1073741824;
}

function test() {
    for (var i = 10000; i < 11000; i++) {
        console.log(gridNoise(0, 0, i));
    }
}
test();

Note: the n suffix (as in 60493n) is BigInt literal notation

There are "big integer" libraries you could use in the interim though - https://github.com/peterolson/BigInteger.js
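An alternative sketch of the same idea (again assuming BigInt support) that keeps the masking in BigInt instead of round-tripping through a binary string; masking with 0x7fffffffn at the end gives the same residue mod 2^31 as C#'s intermediate 32-bit wrap-around does:

```javascript
// Variant of the BigInt approach above: do the final mask in BigInt
// rather than via toString/parseInt. Reducing mod 2^31 only at the end
// matches C#'s result, because wrapping mod 2^32 at each intermediate
// step preserves the value modulo 2^31.
function gridNoiseBig(x, z, seed) {
  let h = (1619 * x + 31337 * z + 1013 * seed) & 0x7fffffff;
  h = (h >> 13) ^ h;
  let n = BigInt(h);
  n = (n * (n * n * 60493n + 19990303n) + 1376312589n) & 0x7fffffffn;
  return 1 - Number(n) / 1073741824;
}
```

Number(n) is safe here because the masked value fits in 31 bits.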

The following doesn't work and never will ... because a 32bit x 32bit multiply produces a 64bit result ... so you'll lose bits already

I misread the code and thought n was only 19 bits (because of the >> 13)

If you limit the result of n * n * 60493 to 32 bits (actually, I made it 31 bits) ... anyway, it seems to work OK

function gridNoise(x, z, seed) {
  var n = (1619 * x + 31337 * z + 1013 * seed) & 0x7fffffff;
  n = (n >> 13) ^ n;

  return 1 - ((n * ((n * n * 60493 & 0x7fffffff) + 19990303) + 1376312589) & 0x7fffffff) / 1073741824;
}

this also works

return 1 - ((n*((n*n*60493 | 0) + 19990303) + 1376312589) & 0x7fffffff)/1073741824;

That limits the interim result to 32 bit which may or may not be "accurate"

You may need to play around with it if you want to duplicate exactly what c# produces
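A sketch of that "play around" step using Math.imul, which performs a genuine wrapping 32-bit multiply and so can mirror the C# overflow behaviour exactly (assuming the C# version does its arithmetic on 32-bit ints):

```javascript
// Sketch using Math.imul: each multiply wraps at 32 bits, exactly as
// C# int multiplication does (assumption: the C# code uses 32-bit ints).
function gridNoise32(x, z, seed) {
  let n = (1619 * x + 31337 * z + 1013 * seed) & 0x7fffffff;
  n = (n >> 13) ^ n;
  // n*n*60493 wraps mod 2^32 at every step, matching C# int overflow
  const inner = Math.imul(Math.imul(n, n), 60493) + 19990303;
  const hash = Math.imul(n, inner) + 1376312589;
  // the additions stay well under 2^53, so they are exact; the final
  // & 0x7fffffff reduces mod 2^31, same as in the C# original
  return 1 - (hash & 0x7fffffff) / 1073741824;
}
```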

-2
Community On

Edit (sorry for the poor previous answer): As others stated before, the problem is that your values are exceeding the safe integer size of a JS Number. If you have the code working in C#, it might be advisable to offload the functionality to an ASP.NET backend, which can handle the calculation and forward the result via some sort of API

0
Jonas Wilms On

To understand what's going on here, one has to examine JavaScript's number type. It is a 64-bit float: essentially a 53-bit integer (the significand) scaled by an 11-bit exponent. If a calculation would produce a 54-bit integer, only the top 53 bits are kept and the value is effectively shifted left by one - the lowest bit is rounded away.

Bitwise operators, on the other hand, work on the lower 32 bits only. Once an integer needs more than 84 bits, its entire low 32-bit range lies below the significand's reach, so those bits are all zero and bitwise operations on it will always yield 0. Large intermediate results therefore tend towards 0 in JS when doing bitwise math, while C# keeps the lower 32 bits exactly (it simply discards the overflowing high bits, so the result stays accurate for those 32 bits even though larger numbers cannot be represented).

  (2 + 2 ** 53) & (2 + 2 ** 53)  // 2
  (2 + 2 ** 54) & (2 + 2 ** 54) // 0
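Extending those two examples (a sketch; the thresholds follow from the 53-bit significand): at 2^53 the float spacing is 2, so a trailing +2 still survives, while at 2^85 the spacing is 2^33, so the +2 and the whole low 32-bit window are rounded away before the bitwise AND ever runs.

```javascript
// Low 32 bits survive just past 2^53, but are gone entirely by 2^85.
console.log((2 + 2 ** 53) & 0xffffffff); // 2
console.log((2 + 2 ** 85) & 0xffffffff); // 0
```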