Hi, nice project!
I wrote an implementation of the Fibonacci sequence in ArnoldC and found that numbers over 2^15 were represented correctly, but numbers over 2^31 wrapped around into negative values. So it appears that the integer type is actually a signed 32-bit integer, not the signed 16-bit integer described in the wiki.
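For anyone who wants to reproduce this without writing ArnoldC, here is a minimal Java sketch of the equivalent arithmetic (not my actual ArnoldC program). Since ArnoldC compiles to JVM bytecode, I'm assuming its integer maps onto a 32-bit signed Java int, which first overflows at fib(47):

```java
public class FibOverflow {
    public static void main(String[] args) {
        // Assumption: ArnoldC's only type behaves like a 32-bit signed
        // JVM int, so Fibonacci wraps at the same point plain Java does.
        int a = 0; // fib(0)
        int b = 1; // fib(1)
        for (int i = 0; i < 47; i++) {
            int next = a + b; // wraps on two's-complement overflow
            a = b;
            b = next;
        }
        // fib(46) = 1836311903 fits in 31 bits, but fib(47) = 2971215073
        // exceeds 2^31 - 1 and wraps to 2971215073 - 2^32 = -1323752223.
        System.out.println(a); // prints -1323752223
    }
}
```

If the 16-bit claim were accurate, the wrap would already happen at fib(24) = 46368, which exceeds 2^15 - 1 = 32767, yet values that size came out fine.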
Link:
"The only variable type in ArnoldC is a 16bit signed integer"
https://github.com/lhartikk/ArnoldC/wiki/ArnoldC