JS's Number type is a 64-bit float. For integers it is only precise up to about 15-16 digits (53 bits). When I enter y = 9223372036854775807 (the max of int64) directly in the browser console, it (per spec) loses quite a bit of precision and gives me back a rounded value.
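A quick way to see the same thing in a node or browser console (any ES2015 engine also exposes the cutoff as Number.MAX_SAFE_INTEGER):

> 9223372036854775807 === 9223372036854775808
true
> Number.MAX_SAFE_INTEGER
9007199254740991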
In addition to what Araq says, JS represents all numbers internally as double-precision floating-point numbers. This means you can represent integers of up to 53 bits without loss of accuracy. The exception is that bitwise operations use only 32 bits. This can lead to interesting surprises, such as the following (run in node):
> x = 1 << 30;
1073741824
> x & x;
1073741824
> x = x * 1024;
1099511627776
> x & x;
0
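The last result is 0 because the bitwise operators first convert their operands to 32-bit signed integers (ToInt32 in the spec), and the low 32 bits of 2^40 are all zero. The 53-bit limit itself is just as easy to trip over (again in node; any engine behaves the same way):

> Math.pow(2, 53) === Math.pow(2, 53) + 1
true
> Math.pow(2, 53) - 1 === Math.pow(2, 53) - 2
false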
Working around that limitation is possible (that's what JS big-integer libraries do), but it is rarely worth the trouble.
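That said, if you really do need exact 64-bit (or larger) integers, engines that already ship the native BigInt type make it fairly painless, with the caveat that BigInts and ordinary numbers can't be mixed in arithmetic:

> 9223372036854775807n
9223372036854775807n
> (1n << 40n) & (1n << 40n)
1099511627776n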