-
**Describe the bug**
The VS Code extension's features do not work, but terminal chat does.
![image](https://github.com/rjmacarthy/twinny/assets/45714701/7067004e-a841-4319-bf63-a00600ccae77)
Throw `F…
-
Hi, I was recently trying VS Code with the [Continue](https://continue.dev/) plugin, configured to use my own Ollama server and LLMs (https://ollama.ai), and was amazed at how well this works.
I'm no…
-
### Reminder
- [X] I have read the README and searched the existing issues.
### System Info
I opened an issue about this earlier and it was closed right away; I don't understand why the value here suddenly becomes 0.068. I think I marked it clearly enough: the final training_loss.png output looks fine, so I don't understand why the value in the logs suddenly jumps.
![image](https://git…
-
### Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [X] I'm not able to find an [open issue](ht…
-
While fine-tuning the `unsloth/codellama-7b` model with transformers v4.40.1 and `save_strategy=epoch`, I encountered the following error:
```python
line 540, in LlamaModel_fast_forward
in…
```
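For reference, the checkpointing setup described above can be sketched as a plain dict; the field names mirror Hugging Face `TrainingArguments`, but the concrete values (epoch count, output path) are illustrative assumptions, not taken from the issue.

```python
# Hedged sketch of the fine-tuning setup described above.
# Field names follow Hugging Face TrainingArguments; output_dir and
# num_train_epochs are assumed values for illustration only.
training_config = {
    "model_name": "unsloth/codellama-7b",
    "save_strategy": "epoch",   # checkpoint at the end of every epoch
    "num_train_epochs": 3,      # assumed value
    "output_dir": "./outputs",  # assumed value
}

print(training_config["save_strategy"])
```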
-
from #13: let's add a `--bootstrap` flag for setup of the initial state of fake GitHub API
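A minimal `argparse` sketch of how such a flag could look; only the flag name comes from the note above, and everything else (the parser description, help text) is hypothetical.

```python
import argparse


def build_parser():
    """CLI sketch: --bootstrap seeds the fake GitHub API with initial state."""
    parser = argparse.ArgumentParser(description="fake GitHub API server")
    parser.add_argument(
        "--bootstrap",
        action="store_true",
        help="populate the fake GitHub API with its initial state before serving",
    )
    return parser


args = build_parser().parse_args(["--bootstrap"])
print(args.bootstrap)  # True when the flag is passed
```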
-
**Describe the bug**
When attempting to follow the [FastChat OpenAI API](https://continue.dev/docs/walkthroughs/codellama#fastchat-api) setup instructions, I get the following error: `openai.error.AP…
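For context, clients usually reach FastChat's OpenAI-compatible server with settings along these lines; the port, key placeholder, and model name here are assumptions for illustration, not taken from the report.

```python
# Hedged sketch: client-side settings for an OpenAI-compatible FastChat server.
# The URL, key placeholder, and model name are illustrative assumptions.
fastchat_client_config = {
    "api_base": "http://localhost:8000/v1",  # assumed default port of FastChat's API server
    "api_key": "EMPTY",                      # placeholder; FastChat does not validate the key
    "model": "codellama-7b",                 # assumed model name
}

print(fastchat_client_config["api_base"])
```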
-
I am using the same config file given in the recipe to fine-tune the CodeLlama-2 model, but I am getting a strange error, attached in the screenshot below.
![gitissueimage](https://github.com/pytorch/torchtune/as…
-
Hello, I am a complete beginner, so I don't know whether I have provided enough information to be helped, but I need help with this, please.
# Prerequisites
Please answer the following questions for yourself …