I'm using a node.js client application to fuzz a remote server.
Using Math.random in the node.js client, I can crash the remote server nearly 100% of the time. However, I've since tried a couple of deterministic, seeded random number generators, and neither is able to crash it.
I suspect it's due to an idiosyncrasy of Math.random or of the seeded generators I've tested.
This is one of the seeded generators I've tried:
    var x = 123456789, y = 362436069, z = 521288629, w = 88675123;

    function random() { // See http://stackoverflow.com/a/6275875
        var t = x ^ (x << 11);
        x = y; y = z; z = w;
        return (w = w ^ (w >> 19) ^ (t ^ (t >> 8))) / (4294967296 / 2);
    }
In what way will the output be different to Math.random()? Also, why does 2^32, 4294967296, need to be divided by 2?
Math.random() uses an implementation-specific algorithm chosen by the JavaScript engine and exposes no way to seed it, whereas your xorshift implementation is engine-independent and explicitly seeded: the four integers x, y, z, w are its entire state, so the same starting values always produce the same sequence.
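A minimal sketch of what that buys you for fuzzing (the seed() helper below is an illustrative name, not part of your snippet): resetting x, y, z, w to the same values replays the identical sequence, which Math.random() cannot do, so a crashing run can be reproduced.

    function seed(a, b, c, d) {
        // The four 32-bit integers are the generator's entire state.
        x = a; y = b; z = c; w = d;
    }

    seed(123456789, 362436069, 521288629, 88675123);
    var first = [random(), random(), random()];

    seed(123456789, 362436069, 521288629, 88675123);
    var second = [random(), random(), random()];
    // first and second contain exactly the same three values.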
The division is not about odd or even numbers. JavaScript's bitwise operators work on signed 32-bit integers, so the updated w lies between -2^31 and 2^31 - 1. Dividing by 4294967296 / 2 (that is, 2^31 = 2147483648) scales that range to roughly [-1, 1). That is also the main difference in output: Math.random() always returns a value in [0, 1), while your random() returns a negative value about half the time. If your fuzzer assumes a non-negative fraction (for example, to pick an index or a length), those negative values change its behaviour and could well be why the crash no longer reproduces.
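If you want the seeded generator to match Math.random()'s [0, 1) range, a sketch under that assumption (random01 is a made-up name; the state update is copied from your snippet) is to coerce the final word to unsigned and divide by the full 2^32:

    function random01() {
        var t = x ^ (x << 11);
        x = y; y = z; z = w;
        w = w ^ (w >> 19) ^ (t ^ (t >> 8));
        // ">>> 0" reinterprets the signed 32-bit word as an unsigned
        // integer in 0 .. 2^32 - 1, so dividing by 2^32 yields [0, 1).
        return (w >>> 0) / 4294967296;
    }

With that change the output range matches Math.random(), which lets you test whether the range, rather than the underlying algorithm, is what triggers the crash.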