
Open your favorite actionscript editor and type:
var n:Number=123456789012345672;
trace(n);
// outputs: 123456789012345660
or simply:
trace(123456789012345672);
// outputs: 123456789012345660
now try incrementing the number by one:
trace(123456789012345673);
// outputs: 123456789012345680
oh dear… IEEE-754 is the specification on which the Number class of actionscript/javascript (and every other ECMAScript language), along with many other languages, is based.
IEEE-754 sucks!
Reading some documentation, this format has a 53-bit significand — that's 53 bits, not 53 digits, so integers are only exact up to 2^53 (about 16 decimal digits)… which explains the rounding above.
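You can see that 2^53 boundary directly. (Sketch in JavaScript here, assuming the same Number behavior as actionscript, since both follow the ECMAScript/IEEE-754 double spec.)

```javascript
// IEEE-754 doubles have a 53-bit significand, so integers are exact
// only up to 2^53 = 9007199254740992 — roughly 16 decimal digits.
var limit = Math.pow(2, 53);

console.log(limit);     // 9007199254740992 — still exact
console.log(limit + 1); // 9007199254740992 — not representable, rounds back down
console.log(limit + 2); // 9007199254740994 — above 2^53 the spacing between doubles is 2
```

That's why 123456789012345672 (18 digits) silently turned into a nearby representable value.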
Ok ok, those were large numbers… but try this:
trace(0.1*3);
// outputs: 0.30000000000000004
trace(0.1*6);
// outputs: 0.6000000000000001
trace(0.1*9);
// outputs: 0.9
trace(0.1*12);
// outputs: 1.2000000000000002
trace(0.1*15);
// outputs: 1.5
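The culprit is that 0.1 has no exact binary representation, so each multiplication starts from an already-rounded value. A common workaround (again sketched in JavaScript, same Number semantics as actionscript) is to work in integer units and divide only at the end, or to round for display:

```javascript
// 0.1 is already rounded in binary, so the error compounds:
console.log(0.1 * 3);              // 0.30000000000000004

// Workaround 1: do the math in integer units (tenths), divide last —
// integer arithmetic is exact below 2^53, so only one rounding happens:
console.log((1 * 3) / 10);         // 0.3

// Workaround 2: round only for display:
console.log((0.1 * 3).toFixed(2)); // "0.30"
```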
I was thinking of using actionscript to collaboratively solve some Project Euler problems… just for fun, but I don’t think I’ll waste a single line of code on it from now on.
This really disappoints me. I mean, if computers can’t even do math, why the hell am I still sitting here?
