Closed — rakataprime closed 1 month ago
Thanks @rakataprime - as discussed on discord:
Hey @rakataprime ! Please, add info in root README.md about this template.
@Dimokus88 @anilmurty This should be good to go. I have made changes to the SDL and added a troubleshooting guide. Also, someone else created and merged a lower-quality PR for vLLM (#521) a few days after this PR was opened. It might be better to merge that example into a single folder rather than keep two separate ones.
@anilmurty can you merge this in now?
Adds a vLLM blog post and deployment examples. vLLM is a multiuser LLM server suitable for LLM services with many concurrent users. @anilmurty
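For context on how clients would talk to the deployed server: vLLM exposes an OpenAI-compatible HTTP API, so a multiuser service can be queried with standard chat-completion requests. A minimal sketch of building such a request payload (the endpoint URL and model name below are placeholders, not part of this PR):

```python
import json

# Hypothetical endpoint: vLLM serves an OpenAI-compatible API
# (by default on port 8000) once the deployment is running.
VLLM_URL = "http://localhost:8000/v1/chat/completions"

def build_request(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Build an OpenAI-style chat-completion payload for a vLLM server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Many concurrent users would each POST a payload like this to VLLM_URL.
payload = build_request("placeholder/model-name", "Hello!")
print(json.dumps(payload))
```

Any HTTP client (or the official `openai` Python package pointed at the server's base URL) can send this payload.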