shansongliu / MU-LLaMA

MU-LLaMA: Music Understanding Large Language Model
GNU General Public License v3.0

cpu inference #16

Closed · wwfcnu closed 11 months ago

wwfcnu commented 11 months ago

Can I use CPU for inference?

shansongliu commented 11 months ago

> Can I use CPU for inference?

Thanks for your interest in our work. In our internal scripts, parameters are sent to the GPU by default, so I don't think it can run on CPU for now. If you need to run on CPU, you can look into our codebase and make slight modifications as needed.
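For reference, a minimal sketch of the kind of modification this would involve, assuming a standard PyTorch loading path. `TinyModel` and `checkpoint.pth` are illustrative placeholders, not MU-LLaMA's actual model class or checkpoint; the real loading code lives in the repo's inference scripts and differs in detail:

```python
import torch
import torch.nn as nn

# Stand-in for the MU-LLaMA model class (placeholder for illustration).
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)

    def forward(self, x):
        return self.proj(x)

# Pick the GPU if one is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = TinyModel()
torch.save(model.state_dict(), "checkpoint.pth")  # illustrative checkpoint

# map_location="cpu" is the key change: it keeps checkpoint tensors on
# CPU even if they were saved from a GPU, so loading works without CUDA.
state = torch.load("checkpoint.pth", map_location="cpu")
model.load_state_dict(state)
model.to(device)  # replaces an unconditional model.cuda()
model.eval()

with torch.no_grad():
    out = model(torch.randn(1, 8, device=device))
print(out.shape)
```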

wwfcnu commented 11 months ago

> > Can I use CPU for inference?
>
> Thanks for your interest in our work. In our internal scripts, parameters are sent to the GPU by default, so I don't think it can run on CPU for now. If you need to run on CPU, you can look into our codebase and make slight modifications as needed.

It would be convenient if the device could be specified in the inference code.
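A sketch of what such an option could look like, using a hypothetical `--device` flag with argparse; MU-LLaMA's inference scripts do not currently expose this:

```python
import argparse

import torch

parser = argparse.ArgumentParser(description="Device selection for inference (illustrative)")
# Hypothetical flag: this is the suggestion above, not an existing MU-LLaMA option.
parser.add_argument(
    "--device",
    default="cuda" if torch.cuda.is_available() else "cpu",
    choices=["cuda", "cpu"],
    help="device to run inference on",
)
args = parser.parse_args()

device = torch.device(args.device)
print(f"Running inference on {device}")
# The model would then be loaded with map_location=args.device and
# moved with model.to(device), as in the sketch above.
```

It would be invoked as, for example, `python inference.py --device cpu`.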

shansongliu commented 11 months ago

> > > Can I use CPU for inference?
> >
> > Thanks for your interest in our work. In our internal scripts, parameters are sent to the GPU by default, so I don't think it can run on CPU for now. If you need to run on CPU, you can look into our codebase and make slight modifications as needed.
>
> It would be convenient if the device could be specified in the inference code.

Thanks for your suggestion. But for LLM inference I strongly recommend using a GPU, since the processing is very computationally intensive; a CPU may not be able to handle it and would take a long time.