tairov / llama2.mojo

Inference Llama 2 in one file of pure 🔥
https://www.modular.com/blog/community-spotlight-how-i-built-llama2-by-aydyn-tairov
MIT License
2.09k stars · 140 forks

+-13-20% speed up by making `steps` an alias and using it in a range #40

Closed rd4com closed 11 months ago

rd4com commented 11 months ago

Hello, I'm new to GitHub. Do you get the same results?
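For context, here is a rough sketch of the idea in Mojo; the function names and the value of `steps` are illustrative, not the actual code in llama2.mojo:

```mojo
# Rough sketch only; names and the value 256 are illustrative, not the repo's code.

# Before: the loop bound is a runtime value, so the compiler cannot
# specialize the loop over token positions.
fn generate_runtime(steps: Int):
    for pos in range(steps):
        # ... one transformer forward pass per position ...
        pass

# After: `steps` is an alias (a compile-time constant), so `range(steps)`
# has a statically known trip count and the loop may be specialized.
alias steps = 256

fn generate_alias():
    for pos in range(steps):
        # ... same work, but with a compile-time-known bound ...
        pass
```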

tairov commented 11 months ago

Hi @rd4com. Thank you for sharing this interesting idea. I'm not certain it's related to the `steps` alias, but I sometimes observe differences between runs within a 10% margin. Maybe it's somehow related to the state of the target VM I'm using for tests.

rd4com commented 11 months ago

:vulcan_salute: