wwfcnu closed this issue 11 months ago
Can I use CPU for inference?
Thanks for your interest in our work. In our internal scripts, parameters are sent to the GPU by default, so it cannot run on CPU for now. If you need to run on CPU, you can look into our codebase and make the slight modifications you need.
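For anyone attempting such a modification, the usual PyTorch pattern is to select a device once and move both the model and its inputs there, rather than hard-coding `.cuda()` calls. A minimal, self-contained sketch (the `TinyModel` class is a hypothetical stand-in for the repo's actual model):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the repo's actual model class.
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(8, 8)

    def forward(self, x):
        return self.linear(x)

# Fall back to CPU automatically when no GPU is available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = TinyModel().to(device).eval()

# Inputs must live on the same device as the parameters.
x = torch.randn(1, 8, device=device)
with torch.no_grad():
    out = model(x)
print(out.device)  # "cpu" on a machine without a GPU
```

Checkpoints saved from GPU tensors also need `torch.load(path, map_location=device)` so the weights land on the chosen device instead of failing on a CUDA-less machine.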
It would be convenient if the device could be specified in the inference code.
Thanks for your suggestion. However, for LLMs I strongly recommend using a GPU, since inference is very computationally intensive; a CPU may not be able to handle it and would take a long time.
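If a device option were exposed, the common pattern is a `--device` flag on the inference script. A brief sketch under that assumption (the flag, default, and checkpoint path are hypothetical, not part of the current codebase):

```python
import argparse

import torch

parser = argparse.ArgumentParser()
# Hypothetical flag: defaults to GPU, falls back to CPU when CUDA is unavailable.
parser.add_argument(
    "--device",
    choices=["cuda", "cpu"],
    default="cuda" if torch.cuda.is_available() else "cpu",
)
args = parser.parse_args()

device = torch.device(args.device)
# Load checkpoint weights onto the chosen device (path is illustrative).
# state_dict = torch.load("checkpoint.pt", map_location=device)
```

As the maintainer notes, CPU inference for an LLM will be far slower, so this is mainly useful for debugging or for machines without a GPU.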