meta-llama / llama

Inference code for Llama models

How can we use the internet mode in llama-2-70b-chat-hf #866

Closed FaizanMunsaf closed 10 months ago

FaizanMunsaf commented 11 months ago

How can we use the internet mode in llama-2-70b-chat-hf?

Is there any reference link available, or anything else that could help me study this further?

lazySde12 commented 11 months ago

https://www.youtube.com/live/9VgpXcfJYvw?si=bsD4u4jArND3eW_a — check this one, I think it will help you.

albertodepaola commented 11 months ago

Hi @FaizanMunsaf, can you provide more details into what the internet mode means in this case? All the models can be used in different app architectures that allow the model to get context from web search results. LangChain has this implemented in the WebSearchRetriever, and LlamaIndex has custom retrievers.
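The retrieval pattern described above (the model gets context from search results before answering) can be sketched in plain Python. This is a toy stand-in, not the actual LangChain `WebSearchRetriever` or LlamaIndex API: the retriever here just ranks documents by keyword overlap, and the document list is invented for illustration.

```python
import re

def retrieve_context(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query.
    A real retriever would use web search results or vector similarity."""
    query_terms = set(re.findall(r"\w+", query.lower()))
    scored = sorted(
        documents,
        key=lambda doc: len(query_terms & set(re.findall(r"\w+", doc.lower()))),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, context: list[str]) -> str:
    """Prepend the retrieved context so the model answers from it."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

# Hypothetical mini-corpus standing in for web search results.
docs = [
    "Llama 2 is a family of open LLMs released by Meta.",
    "The capital of France is Paris.",
    "Retrieval-augmented generation grounds model answers in documents.",
]
query = "What is retrieval-augmented generation?"
prompt = build_prompt(query, retrieve_context(query, docs))
print(prompt)
```

The final `prompt` string is what you would send to the chat model; the model itself never searches the internet, it only sees whatever context the retriever injected.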

FaizanMunsaf commented 11 months ago

> Hi @FaizanMunsaf, can you provide more details about what the internet mode means in this case? All the models can be used in different app architectures that allow the model to get context from web search results. LangChain has this implemented in the WebSearchRetriever, and LlamaIndex has custom retrievers.

Hi @albertodepaola, yes, this works for me, and I have a related question. I am building an Islamic application where the sensitivity of the wording matters a lot, so I need the model to return the exact text it was trained on. As far as I can tell, the model still predicts and generates text even when I set its temperature to 0. Do you have any suggestions that would work for my case? Thanks for your response.

albertodepaola commented 11 months ago

Hey @FaizanMunsaf, I'm not sure I follow your issue:

  1. Are you fine tuning Llama on Islamic content?
  2. Do you want the model to always produce the same results for the same question?
  3. Can you provide sample prompts and outputs?

Thanks

FaizanMunsaf commented 11 months ago

> Hey @FaizanMunsaf, I'm not sure I follow your issue:
>
>   1. Are you fine-tuning Llama on Islamic content?
>   2. Do you want the model to produce the same results for the same question?
>   3. Can you provide sample prompts and outputs?
>
> Thanks

Thanks, @albertodepaola, for your consideration.

Here is my prompt.

prompt("I need only the 6th verse of Surah Al-Baqara in the Holy Quran.")

and here is the response I am getting:

Response -> "And indeed, We have sent you (O Muhammad صلى الله عليه وسلم) as a witness, and a bringer of good news, and a warner, inviting the people to the most gracious way of life, and leading them out of darkness into the light, by their Lord. And indeed, the disbelievers are the ones who are the enemies of the Hereafter." (Surah 2 : aya 55 : Surah Name Surah 2 : reference_name : Mustafa Khatab)"

Please let me know if you need anything else.

Another example:

prompt("give me the 6th verse of Surah Baqarah")

Response -> Sure, here is the 6th verse of Surah Al-Baqarah (the Cow) from the Quran, translated by Mustafa Khatab:

"And indeed, We have honored the children of Israel, but they have forgotten much of what they were reminded of. And indeed. We have given them a clear victory."

Quran Reference: Surah Al-Baqarah, Verse 6 (2:6)

Please note that this is a specific translation of the Quran by Mustafa Khatab, and there may be other translation available that differ slightly in wording or interpretation.

This result is wrong. Here is the attached link: Surah Al Baqarah Exact References

What I have tried:

Set the model's temperature parameter to 0, because I don't need random results.

I have fine-tuned with QLoRA on the new dataset, but my results still aren't good: wrong references, etc. What should I do to improve it and get exact results? I have also tried LlamaIndex to read my dataset, and the result is still the same.

If there's anything that would improve my results with Meta's Llama models, I would be grateful.

I am waiting for your suggestions. Thank you.
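On the temperature-0 point raised above: setting temperature to 0 makes decoding greedy (always pick the highest-probability token), which removes sampling randomness but does nothing to fix wrong facts the model has learned. A toy decoder, with invented logits (not the transformers API), illustrates why:

```python
import math
import random

def sample_next_token(logits: dict[str, float], temperature: float) -> str:
    """Pick the next token from a token->score map.
    temperature == 0 degenerates to greedy argmax: fully deterministic."""
    if temperature == 0:
        return max(logits, key=logits.get)
    # Otherwise: softmax with temperature, then sample.
    scaled = {t: math.exp(s / temperature) for t, s in logits.items()}
    total = sum(scaled.values())
    r = random.random() * total
    for token, weight in scaled.items():
        r -= weight
        if r <= 0:
            return token
    return token  # numerical edge case: fall back to the last token

# Hypothetical logits for the next token after "the verse number is".
logits = {"6": 2.0, "7": 1.5, "55": 0.5}

# Greedy decoding always returns the same token for the same logits.
picks = [sample_next_token(logits, 0.0) for _ in range(10)]
print(picks[0])
```

Note that greedy decoding repeatedly returns "6" here only because "6" has the highest score; if the model's learned scores favor a wrong verse, temperature 0 will deterministically return that wrong verse every time.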

albertodepaola commented 10 months ago

Hi @FaizanMunsaf, the models are not designed to provide exact results. For your case, you should consider designing a system with RAG, or using the model only to identify the correct reference and then searching a structured database. You can find some examples of this in the llama-recipes repository; in particular, the demo apps might be a good resource. I also recommend taking a look at LlamaIndex and LangChain.

FaizanMunsaf commented 10 months ago

> Hi @FaizanMunsaf, the models are not designed to provide exact results. For your case, you should consider designing a system with RAG or using the model to only identify the correct reference to then search in a structured database. You can find some examples of this in the llama-recipes repository, in particular, the demo apps might be a good resource. Also recommend taking a look at LlamaIndex and LangChain.

Thanks for your recommendation!