I have the following deterministic noise function which I've been using in a C# and C++ terrain generator for a while:
float GridNoise(int x, int z, int seed)
{
    int n = (1619*x + 31337*z + 1013*seed) & 0x7fffffff;
    n = (n >> 13) ^ n;
    return 1 - ((n*(n*n*60493 + 19990303) + 1376312589) & 0x7fffffff)/(float)1073741824;
}
It returns a 'random' float between -1 and 1 for any integer x/z coordinates I enter (plus there's a seed so I can generate different terrains). I tried implementing the same function in JavaScript, but the results aren't as expected. For small values it seems OK, but for larger values (of the order of ~10000) the results are less and less random, and eventually all it returns is 1.
You can see it working correctly in C# here, and the incorrect JS results for the same input here.
I suspect it's something to do with JS variables not being strict integers, but can anyone shed more light? Does anyone have a similarly simple deterministic function I could use in JS if this doesn't work?
I'm afraid your code is exceeding JavaScript's safe integer range. JavaScript numbers are IEEE 754 doubles, so integer values are only exact up to Number.MAX_SAFE_INTEGER (2^53 - 1). For large n, the intermediate product n*(n*n*60493 + 19990303) far exceeds that, and the low-order bits (exactly the bits your hash depends on) are rounded away. Once the result is only precise to a granularity coarser than 2^32, every bit that survives the & 0x7fffffff mask is zero, so
((n*(n*n*60493 + 19990303) + 1376312589) & 0x7fffffff)/1073741824
evaluates to 0, and thus 1 - 0 = 1.
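One way around this is to keep every multiplication inside 32-bit integer arithmetic with Math.imul, which multiplies its operands as C-style 32-bit ints with wraparound, much like the int arithmetic in your C#/C++ version. Here's a minimal sketch of such a port (gridNoise is just my name for it; I haven't verified it matches your C# output bit-for-bit, but the wraparound semantics should line up):

```javascript
// JS port of GridNoise that avoids double-precision overflow by doing
// all multiplications as wrapped 32-bit integer multiplies via Math.imul.
function gridNoise(x, z, seed) {
  // Each Math.imul wraps mod 2^32, like C# int multiplication.
  let n = (Math.imul(1619, x) + Math.imul(31337, z) + Math.imul(1013, seed)) & 0x7fffffff;
  n = (n >> 13) ^ n;
  // Original expression: n*(n*n*60493 + 19990303) + 1376312589,
  // rebuilt with Math.imul so no intermediate exceeds 2^53.
  const m = (Math.imul(n, Math.imul(Math.imul(n, n), 60493) + 19990303) + 1376312589) & 0x7fffffff;
  return 1 - m / 1073741824; // result is in (-1, 1]
}
```

The additions are still plain + (their operands stay far below 2^53, so they're exact), and the bitwise & then truncates mod 2^32, which matches the C# wraparound behaviour. This should stay properly 'random' even for coordinates in the tens of thousands.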