-
Is the 1T version basically V1? If so, is the HF version of V1 (1T) already deduplicated and ready to be used?
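If the HF copy is what's needed, a minimal sketch for pulling it, assuming the public `togethercomputer/RedPajama-Data-1T` dataset and its per-source configs; streaming avoids downloading the full corpus up front:
```
from datasets import load_dataset

# Stream one source of the HF copy; the config names ("arxiv", "book",
# "c4", "common_crawl", "github", ...) are the dataset's public subsets.
# Newer `datasets` versions may also require trust_remote_code=True.
ds = load_dataset(
    "togethercomputer/RedPajama-Data-1T",
    "arxiv",
    split="train",
    streaming=True,
)
print(next(iter(ds))["text"][:200])
```
Whether a given subset is deduplicated is exactly the question above, so the dataset card is the place to confirm before training.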
-
# Finetuning Redpajama (OpenLlama)
[Finetuning Redpajama](https://www.storminthecastle.com/posts/finetune_redpajama/)
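The linked post walks through finetuning a RedPajama (OpenLlama-family) checkpoint. As a hedged sketch of that kind of setup, not the post's exact code, LoRA on the 3B INCITE base model might look like this; the model name and hyperparameters are placeholders:
```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "togethercomputer/RedPajama-INCITE-Base-3B-v1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

# GPT-NeoX-style models fuse the attention projections into "query_key_value".
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["query_key_value"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only the LoRA adapters train
```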
-
Hi! I am an undergraduate student who is interested in your team's project. When I ran the demo code in the EE-Tuning part, I found there are no tuning scripts for the llama2-7b model (they are only provided for 13b and 70b)…
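Not from the EE-Tuning repo, but if the provided 13b/70b scripts follow the usual Megatron-style layout, adapting one for 7b mostly means substituting Llama-2-7B's published architecture numbers:
```
# Llama-2-7B architecture values (from the public model config); the
# surrounding script arguments they would replace are an assumption.
LLAMA2_7B = {
    "num_layers": 32,
    "hidden_size": 4096,
    "num_attention_heads": 32,
    "ffn_hidden_size": 11008,  # SwiGLU intermediate size
    "max_position_embeddings": 4096,
    "vocab_size": 32000,
}
```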
-
https://github.com/togethercomputer/redpajama.cpp
https://www.together.xyz/blog/redpajama-models-v1
-
- [ ] download script
- [ ] preprocess
- [ ] binarize for Megatron (see the sketch below)
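A minimal sketch of the binarize step, assuming Megatron-LM's stock `tools/preprocess_data.py` and jsonl input with a `text` field; the paths, tokenizer files, and exact flag spellings (which vary across Megatron versions) are placeholders:
```
import subprocess

subprocess.run(
    [
        "python", "tools/preprocess_data.py",
        "--input", "redpajama_arxiv.jsonl",      # output of the preprocess step
        "--output-prefix", "redpajama_arxiv",    # emits a .bin/.idx pair
        "--tokenizer-type", "GPT2BPETokenizer",  # swap in the target tokenizer
        "--vocab-file", "gpt2-vocab.json",
        "--merge-file", "gpt2-merges.txt",
        "--append-eod",
        "--workers", "16",
    ],
    check=True,
)
```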
-
Creating a French Llama version by translating RedPajama dataset
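As a hedged sketch of the translation step, assuming jsonl records with a `text` field and an off-the-shelf MT model; the model choice and paragraph-level chunking are assumptions, not part of the original project:
```
import json
from transformers import pipeline

translator = pipeline("translation_en_to_fr", model="Helsinki-NLP/opus-mt-en-fr")

with open("sample.jsonl") as src, open("sample_fr.jsonl", "w") as dst:
    for line in src:
        record = json.loads(line)
        # opus-mt models have a short input limit, so translate per paragraph
        paragraphs = record["text"].split("\n")
        translated = [
            translator(p, max_length=512)[0]["translation_text"] if p.strip() else p
            for p in paragraphs
        ]
        record["text"] = "\n".join(translated)
        dst.write(json.dumps(record, ensure_ascii=False) + "\n")
```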
-
Hi,
I was trying to download redpajama_v1/book, but the following link corresponding to this dataset is unavailable:
```
https://data.together.xyz/redpajama-data-1T/v1.0.0/book/book.jsonl
```
-
```
from open_flamingo import create_model_and_transforms

model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained=…
```
-
Hello, I'm processing the RedPajama data and it's unacceptably slow, especially the book domain. Any suggestions, please?
Or could you share a copy of your processed training data? Thanks a lot!
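One common speed-up, sketched under the assumption that the bottleneck is per-record CPU work over jsonl shards: fan the shards out across worker processes. `process_record` and the file layout below are placeholders:
```
import glob
import json
from multiprocessing import Pool

def process_record(record: dict) -> dict:
    # stand-in for the actual per-record cleaning/filtering
    record["text"] = record["text"].strip()
    return record

def process_file(path: str) -> str:
    out_path = path + ".processed"
    with open(path) as src, open(out_path, "w") as dst:
        for line in src:
            dst.write(json.dumps(process_record(json.loads(line))) + "\n")
    return out_path

if __name__ == "__main__":
    files = glob.glob("book/*.jsonl")
    with Pool(processes=16) as pool:
        for done in pool.imap_unordered(process_file, files):
            print("finished", done)
```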
-
```
model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained="openai",
    lang_encoder_path=model_p…
```
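For reference, a complete call in the shape of the OpenFlamingo README example; the RedPajama-based checkpoint paths are the publicly documented 3B variant and are an assumption relative to the truncated snippet above:
```
from open_flamingo import create_model_and_transforms

model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained="openai",
    lang_encoder_path="anas-awadalla/mpt-1b-redpajama-200b",
    tokenizer_path="anas-awadalla/mpt-1b-redpajama-200b",
    cross_attn_every_n_layers=1,
)
```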