Open gdmcdonald opened 1 year ago
Love the look of the tool in the video. Can I run this in Hugging Face to try it out? Or Docker? That would make it easier for others to see if it works for their case before going through the install.

@gdmcdonald Thanks for the comment! I'm not personally familiar with the Hugging Face compute environment, but yes, I believe you can run it in Hugging Face, Docker, or any other cloud or local virtual environment, as long as it has access to an NVIDIA GPU. If you are familiar with AWS, I have put together a setup manual for running this tool on AWS EC2. Please take a look at the README of https://github.com/yy29/aws-ec2-tips-llm-chat-ai, and I hope it helps.
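For the Docker route, a minimal sketch of launching a container with GPU access. This assumes the NVIDIA Container Toolkit is installed on the host; the image name and port are placeholders, not this project's actual image.

```shell
# Sketch: run a container with all host GPUs exposed (requires the
# NVIDIA Container Toolkit). Replace the image name and port with
# whatever the tool's own Dockerfile/docs specify.
docker run --rm --gpus all -p 7860:7860 your-image:latest

# Quick check that the container can see the GPU:
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If `nvidia-smi` prints your GPU inside the container, the tool itself should be able to use it as well.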