North-Western-Development / oc2r

RISC-V VMs in Minecraft. Continued by North Western Development, originally by Sangar.
https://www.curseforge.com/minecraft/mc-mods/oc2r

The frequency and time measured with the different CPU tiers do not match the speed displayed in the tooltip #15

Open getItemFromBlock opened 1 month ago

getItemFromBlock commented 1 month ago

Describe the bug
While working on my C++ rasterizer program for OC2, I quickly realized not only that the tier 4 CPU's speed did not match expectations, but also that time measured on it was wrong. My program uses the C time.h library to measure render time, and the measured values differ from real time. I had already seen this with OC2, but this time changing the CPU tier also affects how time passes on the computer. Depending on the operations, time can run 25-50 times slower than real time; on the other hand, the speed improvement of the tier 4 CPU over the tier 1 seems to vary a lot, and I could only get a maximum speedup of about 1.3x. It looks like the frequency affects the speed of some operations (like writing/reading the projector framebuffer?) but not others.
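For context, a minimal sketch of the kind of measurement involved (illustrative only; the rasterizer itself ships as a binary in the zip, so its actual timing code is an assumption here). If the guest clock is driven by emulated cycles rather than host time, the interval printed below will drift from a real stopwatch:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* Wall-clock timing via time.h, as the report describes. */
    struct timespec start, end;
    clock_gettime(CLOCK_REALTIME, &start);

    /* Stand-in for the render loop. */
    volatile unsigned long sink = 0;
    for (unsigned long i = 0; i < 100000000UL; i++)
        sink += i;

    clock_gettime(CLOCK_REALTIME, &end);
    double elapsed = (end.tv_sec - start.tv_sec)
                   + (end.tv_nsec - start.tv_nsec) / 1e9;
    printf("measured: %.3f s (compare against a real stopwatch)\n", elapsed);
    return 0;
}
```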

To Reproduce
files.zip

  1. In singleplayer (not tested on a server, but I doubt it will make a difference), place a preconfigured computer and connect a projector to it.
  2. Download the binary files provided in the zip above, and add execution permission to the rasterizer program: chmod +x rasterizer
  3. Power off the computer and swap the CPU for a tier 1. Boot the computer again.
  4. Run ./rasterizer ship.bin 5.0; this should display a 3D model on the projector for about 5 seconds.
  5. Power off the computer and swap the CPU for a tier 4. Boot the computer one last time.
  6. Run ./rasterizer ship.bin 5.0, and observe that the program does not run much faster, while the time it reports is also much slower than before. (You can press Ctrl+C to interrupt the program.) A standalone timing check that takes the rasterizer out of the equation is sketched after this list.
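A hypothetical standalone check like the one below (written for this report, not included in the zip) could separate the two symptoms: the fixed workload isolates raw CPU throughput across tiers, while reading several clock sources shows whether all guest timekeeping drifts together, which would point at the RTC rather than at any one program's timing:

```c
#include <stdio.h>
#include <time.h>

static double ts_diff(struct timespec a, struct timespec b) {
    return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
}

int main(void) {
    struct timespec r0, r1, m0, m1;
    clock_t c0, c1;

    clock_gettime(CLOCK_REALTIME, &r0);
    clock_gettime(CLOCK_MONOTONIC, &m0);
    c0 = clock();

    /* Fixed workload: the same instruction count on every tier,
       so a faster CPU should simply report a shorter elapsed time. */
    volatile unsigned long sink = 0;
    for (unsigned long i = 0; i < 200000000UL; i++)
        sink += i;

    c1 = clock();
    clock_gettime(CLOCK_MONOTONIC, &m1);
    clock_gettime(CLOCK_REALTIME, &r1);

    printf("CLOCK_REALTIME : %.3f s\n", ts_diff(r0, r1));
    printf("CLOCK_MONOTONIC: %.3f s\n", ts_diff(m0, m1));
    printf("clock()        : %.3f s\n", (double)(c1 - c0) / CLOCKS_PER_SEC);
    return 0;
}
```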

Log files
latest.log, debug.log. They do not seem to contain anything relevant.

Expected behavior
The time measured should at least be the same across all CPU tiers, and the speed of each CPU should be closer to its advertised value.


Additional context
The time issue is probably related to another bug, even if the incorrect CPU speed has an effect on it.

hickorysb commented 2 weeks ago

As for the reported frequency not matching, that's interesting and likely an issue with how Sedna was written, but I'll have to take a look once I have time again. The time issue is likely also an issue with how Sedna implements the RTC system. The actual performance, though, is likely more a limitation of the host machine than anything, especially when you consider that the tier 4 CPU, if I recall, is by default set to a clock speed that is about 1/3 of an average host CPU's clock speed today. Most host CPUs won't be able to keep up with both interpreting and executing the RISC-V instructions in real time at those speeds. Correcting for this will require tuning the values so they make sense on different systems. The best way to handle this would likely be to make the clock speed configurable and set the baseline config to max out at something much lower.
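As a back-of-the-envelope illustration of that limitation (a sketch of a generic cycle-budgeted interpreter, not of Sedna's actual implementation; both numbers below are placeholders): if guest time advances with retired cycles and the host cannot retire the configured cycle budget in real time, the guest clock falls behind wall time by exactly that ratio.

```c
#include <stdio.h>

int main(void) {
    /* Placeholder values, not Sedna's actual configuration. */
    double target_hz = 1.0e9; /* configured guest clock for a fast tier */
    double host_ips  = 4.0e7; /* instructions/s the host interpreter sustains */

    /* One guest second is target_hz cycles; at host_ips cycles per
       real second, it takes target_hz / host_ips real seconds. */
    double slowdown = target_hz / host_ips;
    printf("guest time runs %.0fx slower than wall time\n", slowdown);
    return 0;
}
```

With these placeholder numbers the guest clock would run 25x slow, in the same range as the 25-50x slowdown reported above.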