rigille closed this issue 1 month ago.
Thank you for solving my puzzle.
I'll add your work as a new benchmark to this project, so more people will know the truth.
Now, my question is: how does Bun run so fast?

    Command being timed: "scheme --optimize-level 3 --script ./src/sumfp/sumfp-ignore-setuptime.scm"
    User time (seconds): 1.29
    Command being timed: "node ./src/sumfp/sumfp-ignore-setuptime.js"
    User time (seconds): 1.42
    Command being timed: "bun run ./src/sumfp/sumfp-ignore-setuptime.js"
    User time (seconds): 0.35
Looks like Bun does have tail-call optimization (see https://www.onsclom.net/posts/javascript-tco). One other thing that can certainly hurt performance in floating-point code is if the values are not locally unboxed, or if there are runtime type checks. But we can only know that by looking at the machine code that gets generated.
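For intuition, here is a minimal sketch of the kind of tail-recursive floating-point sum this benchmark exercises; it is not the repository's actual src/sumfp/sumfp.js. Whether the recursive call reuses its stack frame depends on the engine: JavaScriptCore (which Bun embeds) implements proper tail calls in strict mode, while V8 (Node) does not, so very large inputs would overflow the stack there.

```js
"use strict";

// Hypothetical tail-recursive float sum in the style of the classic
// sumfp benchmark; the counter and accumulator are doubles throughout.
function sumLoop(i, n, acc) {
  if (i > n) return acc;
  // Tail position: an engine with proper tail calls can run this in
  // constant stack space; otherwise each call adds a stack frame.
  return sumLoop(i + 1.0, n, acc + i);
}

// Kept small so it also finishes on engines without proper tail calls.
console.log(sumLoop(0.0, 1e4, 0.0)); // 50005000
```

If the JIT keeps `i` and `acc` as unboxed doubles in the compiled loop, the work stays in registers; if it falls back to boxed numbers or inserts type checks, throughput drops, which is exactly the kind of difference only the generated machine code reveals.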
The results of the sumfp benchmark are pretty unintuitive. Chez does tail-call optimization while Node and Bun don't. Why then does Chez take longer to run? It's all startup time.
src/sumfp/sumfp.scm
Output:
src/sumfp/sumfp.js
Output:
Startup time doesn't matter much if the program takes a few seconds to run, but if it takes milliseconds then it can make a big difference.
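One way to sanity-check the startup-time explanation, as a rough sketch rather than anything this repo necessarily does: time the hot loop from inside the process (assuming a runtime where `performance.now()` is a global, as in recent Node and Bun) and compare that against the wall-clock figure an external tool reports; the gap is roughly startup and teardown cost.

```js
// Rough sketch: measure only the workload, excluding interpreter startup.
// Compare this number against the externally measured total time to
// estimate how much of the total is startup overhead.
const t0 = performance.now();

let acc = 0.0;
for (let i = 0.0; i <= 1e7; i += 1.0) acc += i;

const t1 = performance.now();
console.log(`sum = ${acc}`);
console.log(`in-process time: ${(t1 - t0).toFixed(1)} ms`);
```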