FasterDecoding / Medusa

Medusa: Simple Framework for Accelerating LLM Generation with Multiple Decoding Heads
https://sites.google.com/view/medusa-llm
Apache License 2.0

Containerization with a Dockerfile to set up Medusa #104

Open gangooteli opened 4 months ago

gangooteli commented 4 months ago

@ctlllll Please provide a Dockerfile for Medusa. I ran into lots of errors while doing the setup. It would be good to have a containerized environment that supports both training and inference.

It's very hard to set up Medusa.

Let me know if I can help with this. I am almost there but have not been able to run it successfully. A rough sketch of what I have in mind is below.
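
For reference, here is a minimal sketch of the kind of Dockerfile I am picturing, based on the editable install from the repo README. The CUDA base image tag and the `[train]` extra are assumptions on my part and may need adjusting for the actual dependency setup.

```Dockerfile
# Base image with PyTorch and CUDA preinstalled; the tag is an assumption,
# pick one that matches your driver/CUDA version.
FROM pytorch/pytorch:2.1.0-cuda11.8-cudnn8-devel

WORKDIR /workspace

# git is needed to clone the repo; keep the image slim otherwise.
RUN apt-get update \
    && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*

# Install Medusa from source in editable mode, following the README.
RUN git clone https://github.com/FasterDecoding/Medusa.git /workspace/Medusa
WORKDIR /workspace/Medusa
RUN pip install --no-cache-dir -e .

# Training dependencies -- I am assuming a "[train]" extra exists;
# drop or replace this line if the package does not define one.
# RUN pip install --no-cache-dir -e ".[train]"

CMD ["/bin/bash"]
```

Building and running with GPU access would then look something like `docker build -t medusa .` followed by `docker run --gpus all -it medusa`.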