Closed ProxyPlayerHD closed 1 year ago
This is probably because you defined the variables to be 32-bit, but `int` is 16 bits on cc65816, so `%u` expects a 16-bit value while 32 bits are pushed. Use `%lu` (length modifier `l` before the `u` conversion) to specify a 32-bit value in the format string.
but that doesn't make a difference in this case. if the number is small enough to fit into a 16-bit int, then reading only the bottom 16 bits of a 32-bit variable should be functionally identical to reading the whole thing. and it does seem to work, given that `endTime` is a 32-bit variable and it's getting printed correctly by the `Raw Time: %u` part of the printf statement.
on a side note, `%ul` doesn't seem to work: it just prints the same value as `%u` followed by a lowercase L (the trailing `l` is treated as a literal character, since `%u` is already a complete conversion). for example `Raw Time: %u` prints `Raw Time: 99` at the end, while `Raw Time: %ul` prints `Raw Time: 99l`.
and swapping out the `uint32_t` for timeSeconds/timeMinutes/timeHours with `uint16_t` or even `uint8_t` doesn't fix it either; it still just prints `0 hours 0 minutes 1 seconds` at the very end. so something does seem to be broken...
It is not functionally identical to push 16-bit or 32-bit numbers on the stack. The `printf()` function uses a format string, and the arguments are pushed on the stack with their (default-promoted) size. The first value probably works because the 65816 is little endian; the rest will be read from the wrong offset. You need to make them match: right now the format string says 16 bits but the value pushed is 32 bits.
There are two ways to fix it. You can either cast the arguments to `(int)`, or you change the specifier to `%lu`, but in the latter case you probably need to link a `printf()` formatter that supports `%lu`; the next step up from the default one should do. This is described in the user guide.
oh i see what you mean, i thought it would automatically cast it to the correct type... maybe i'm a bit spoiled by the way GCC checks format strings during compilation and throws warnings about mismatched types.
oh well, it seems to be working now! thanks!
in case you're interested: for a 150x50 resolution, 256-iteration Mandelbrot set, my 20MHz 65816 SBC took 16 minutes 57 seconds using floating-point numbers and 5 minutes 26 seconds using fixed-point numbers.
Basically i've got this small piece of code that converts a variable counting seconds into hours/minutes/seconds to then be printed, but it doesn't work as it should.
this is a basic example, where it just counts up to 100 "seconds". at the end, when it reaches 99, the output should be:
`0 hours 1 minutes 39 seconds`
and that is the exact output i get when i use an online C compiler or even my local GCC install. but on my 65816 SBC i get this instead: `0 hours 0 minutes 1 seconds`
but i don't understand why; it's all 32-bit variables, so there is no way it's overflowing somewhere an x86 compiler wouldn't. plus i checked the .lst file and it's calling `_UDivMod32`, which should be correct. but i'll include the whole .lst file (renamed to .txt so i can upload it) in case you want to check it yourself: test.txt