meta-llama / codellama

Inference code for CodeLlama models
15.87k stars 1.84k forks

Unable to run it under Windows 10 #40

Open eranif opened 1 year ago

eranif commented 1 year ago

I followed the instructions, but I was unable to run it under Windows 10 due to NCCL (which is not supported on Windows)

GaganHonor commented 1 year ago

Or, I guess you could skip running it on your own system entirely and use it on AWS instead :) Still, this may help 👇

import os
import torch

# NCCL is not available on Windows, so force the gloo backend instead.
# PL_TORCH_DISTRIBUTED_BACKEND is read by PyTorch Lightning; the explicit
# init_process_group call below covers plain PyTorch.
os.environ['PL_TORCH_DISTRIBUTED_BACKEND'] = 'gloo'
os.environ['NCCL_DEBUG'] = 'INFO'
torch.distributed.init_process_group(backend="gloo")
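As a quick sanity check that the gloo backend itself works on your machine (a minimal single-process sketch, separate from the codellama launch; it uses the standard MASTER_ADDR/MASTER_PORT rendezvous variables that torchrun would normally set for you):

```python
import os
import torch.distributed as dist

# torchrun normally sets these; set them manually for a single-process test.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")

# world_size=1, rank=0: a one-process "group", enough to verify gloo works.
dist.init_process_group(backend="gloo", rank=0, world_size=1)
print(dist.get_backend())  # prints: gloo
dist.destroy_process_group()
```

If this fails, the problem is with the PyTorch/gloo setup itself rather than with the codellama scripts.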
eranif commented 1 year ago

I am not sure that I understand your comment about AWS...

I am actually trying to make it work to see whether I can integrate it into CodeLite IDE... (as a plugin that generates code based on a comment in the editor)

Also, I already tried that solution; it still does not work.

Eran

GaganHonor commented 1 year ago

Apologies for any confusion caused by my previous comment mentioning AWS. I misunderstood your intention to integrate Code Llama into CodeLite IDE. I appreciate your clarification.

Regarding your goal of integrating Code Llama as a plugin in CodeLite IDE to generate code based on comments in the editor, it sounds like an interesting project. However, if you have already tried the suggested solution and it did not work, let's explore alternative approaches to address the issue.

Here are a few suggestions:

Consult the CodeLite community: reach out to the CodeLite forums or project maintainers with your specific integration goal. They may have insights, examples, or documentation on plugin development, best practices, or known issues.

Check for compatibility: ensure Code Llama is compatible with the version of CodeLite IDE you are using, and verify any requirements or considerations for integrating external tools as plugins.

Review the plugin development documentation: look for documentation or tutorials on developing CodeLite plugins, paying attention to any specific steps or configuration required for integrating external tools like Code Llama.

Remember to provide detailed information about any error messages or specific issues you encounter during the integration process. This will assist others in understanding the problem better and offering more targeted solutions.

I hope these suggestions help you in your journey to integrate Code Llama into CodeLite IDE. If you have any further questions or need additional assistance, please feel free to ask. Good luck with your project!

Source: GitLab community

For more help, try here ✅ https://aws.amazon.com/blogs/machine-learning/llama-2-foundation-models-from-meta-are-now-available-in-amazon-sagemaker-jumpstart/

And please share more details if possible ❇️

Lastly, I found this, which may also be helpful: https://huggingface.co/blog/codellama#conversational-instructions. Great post. Thanks for reading.
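If the plugin ends up using an instruct/chat variant, the conversational format described in that post can be sketched roughly like this (an illustration only; `build_instruct_prompt` is a hypothetical helper, and the exact template should be verified against the linked post):

```python
def build_instruct_prompt(instruction: str, system: str = "") -> str:
    """Wrap a user instruction in the [INST] tags the instruct models expect.

    Follows the Llama 2-style template described in the Hugging Face post;
    an optional system message goes inside <<SYS>> markers.
    """
    if system:
        return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"
    return f"<s>[INST] {instruction} [/INST]"

prompt = build_instruct_prompt("Write a C++ function that reverses a string.")
print(prompt)
```

The model's completion then follows the closing `[/INST]` tag, which is where a plugin would read the generated code from.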

eranif commented 1 year ago

Thanks for the tips. I have a pretty clear idea in my head, and I think it's achievable. Regarding CodeLite IDE: I don't need to consult CodeLite's docs, since I wrote CodeLite and I also wrote the docs ;)

Eran

GaganHonor commented 1 year ago

Sir @eranif 🙏 I apologize for taking up your time despite my limited knowledge, and I am very happy that I got the chance to chat with the creator of #CodeLite 🌟

Gagan