LlamaFamily / Llama-Chinese

Llama Chinese community. The online Llama3 demo and fine-tuned models are now open, the latest Llama3 learning resources are compiled in real time, and all code has been updated for Llama3. Building the best Chinese Llama LLM, fully open source and commercially usable.
https://llama.family

How do I set the system prompt? #273

Open tigermask1978 opened 11 months ago

tigermask1978 commented 11 months ago

I want to use the Atom model to build a RAG application. Is this prompt format correct? Thanks!

```
<s>System: Answer the query using the context provided. Be succinct.\n</s>
<s>Human: query: What is the default batch size for map_batches? context: batch_size.Note The default batch size depends on your resource type. If you’re using CPUs, the default batch size is 4096. If you’re using GPUs, you must specify an explicit batch size. The actual size of the batch provided to fn may be smaller than batch_size if batch_size doesn’t evenly divide the block(s) sent to a given map task. Default batch_size is 1024 with “default”. compute – This argument is deprecated. Use concurrency argument. # Specify that each input batch should be of size 2. ds.map_batches(assert_batch, batch_size=2) Caution The default batch_size of 4096 may be too large for datasets with large rows (for example, tables with many columns or a collection of large images). Configuring Batch Size# Configure the size of the input batch that’s passed to __call__ by setting the batch_size argument for ds.map_batches() batch_size=64, shuffle=True) </s>
<s>Assistant:
```

ZHangZHengEric commented 10 months ago

Yes, that's the right format.
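For reference, the confirmed format above can be assembled programmatically. The sketch below is a minimal illustration, not an official helper from this repo; the exact newline placement around the role tags is taken from the example in this issue and may need adjustment for your tokenizer.

```python
def build_rag_prompt(system: str, query: str, context: str) -> str:
    """Assemble a RAG prompt in the Atom chat format shown in this issue.

    Turns are wrapped in <s>Role: ...</s> tags, and the prompt ends with
    an open "<s>Assistant:" turn so the model continues as the assistant.
    """
    return (
        f"<s>System: {system}\n</s>"
        f"<s>Human: query: {query} context: {context}\n</s>"
        f"<s>Assistant:"
    )


prompt = build_rag_prompt(
    system="Answer the query using the context provided. Be succinct.",
    query="What is the default batch size for map_batches?",
    context="If you're using CPUs, the default batch size is 4096.",
)
print(prompt)
```

The resulting string can then be passed directly to the tokenizer/generation pipeline as a single input sequence.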