mit-han-lab / TinyChatEngine

TinyChatEngine: On-Device LLM Inference Library
https://mit-han-lab.github.io/TinyChatEngine/
MIT License

metal gpu matrix3D addition test #87

Open DerrickYLJ opened 10 months ago

DerrickYLJ commented 10 months ago

As part of adding Metal GPU support, this PR creates a folder named metal containing an element-wise addition test for Matrix3D.
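
For context, a minimal sketch of what such an element-wise addition kernel could look like in Metal Shading Language. The file name, kernel name, buffer indices, and the assumption that a Matrix3D<float> of shape (x, y, z) is stored as one contiguous buffer are illustrative and not taken from this PR.

```metal
// metal/add_f32.metal -- hypothetical file name, not from this PR.
#include <metal_stdlib>
using namespace metal;

// Element-wise C = A + B over flat float buffers. A Matrix3D of shape
// (x, y, z) is assumed to be laid out contiguously, so the host can
// dispatch x * y * z threads and treat the addition as a 1-D problem.
kernel void add_f32(device const float *A [[buffer(0)]],
                    device const float *B [[buffer(1)]],
                    device float       *C [[buffer(2)]],
                    constant uint      &n [[buffer(3)]],
                    uint gid [[thread_position_in_grid]])
{
    if (gid < n) {
        C[gid] = A[gid] + B[gid];
    }
}
```

The host side of the test would then dispatch x * y * z threads over the flattened buffers and compare the GPU result against a plain scalar loop on the CPU.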

DamianB-BitFlipper commented 9 months ago

Hi! Thanks for the work on this. What is the status of getting this model working on Apple Metal?

I tried to build it with USE_METAL=1 make chat, but the build fails with: 'ops/metal/BMM_F16T.cuh' file not found. It looks like the ops/metal/ includes are not part of this branch. Is this still a work in progress?

DerrickYLJ commented 9 months ago

> Hi! Thanks for the work on this. What is the status of getting this model working on Apple Metal?
>
> I tried to build it with USE_METAL=1 make chat, but the build fails with: 'ops/metal/BMM_F16T.cuh' file not found. It looks like the ops/metal/ includes are not part of this branch. Is this still a work in progress?

Thanks for your interest! We are still working on Metal GPU support for TinyChatEngine; it will be available soon :)