anakin87 / autoquizzer

Generates a quiz from a URL. You can play the quiz, or let the LLM play it.
https://huggingface.co/spaces/deepset/autoquizzer
Apache License 2.0

Command R+ Support to use the 128k context #1

Open S4lXLV opened 6 months ago

S4lXLV commented 6 months ago

Hey, as the title says: can you add Command R+ support?

anakin87 commented 6 months ago

Hello! In its current form, AutoQuizzer is only a nice demonstration.

I invite you to fork the repo and replace the current Generator with another one that is compatible with the desired model.
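For example, here is a rough sketch of what swapping in Command R+ could look like, assuming you use the `cohere-haystack` integration and have a `COHERE_API_KEY` environment variable set (the actual component and wiring in `backend/pipelines.py` may differ):

```python
# pip install cohere-haystack
from haystack_integrations.components.generators.cohere import CohereGenerator

# Hypothetical drop-in replacement for the current Generator component;
# reads the API key from the COHERE_API_KEY environment variable by default.
generator = CohereGenerator(model="command-r-plus")
```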

Some docs:

S4lXLV commented 6 months ago

Thank you. Another question: will it always give the same questions and only use the beginning of the webpage, even if I use a bigger model like Command R+? Right now, no matter how many times I run it, it gives the exact same questions taken from the very beginning of the webpage. I tried increasing the tokens, but the result is the same. I would like to get varied questions from all over the article I am passing; maybe that is possible to achieve with Groq.

anakin87 commented 6 months ago

As mentioned in the README, I am truncating the text to the first 4k characters, because in the online version I do not want to hit Groq rate limits: https://github.com/anakin87/autoquizzer/blob/083f13e3a38e0d5cc9a484937a7469d56405cfa4/backend/pipelines.py#L35

If you are using the project locally, you can safely remove this limit.
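As a minimal sketch of what that truncation amounts to (the actual line in `backend/pipelines.py` may look different):

```python
MAX_CHARS = 4000  # assumed name; the real constant/line in pipelines.py may differ

def truncate_page_text(text: str) -> str:
    # Keep only the beginning of the fetched page to stay within Groq rate limits.
    # Remove the slice (or raise MAX_CHARS) when running locally to use the whole page.
    return text[:MAX_CHARS]
```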

S4lXLV commented 6 months ago

Thank you for being patient with me, I'm kind of new to all of this. So, by removing that line, it would go through the full page, right? Also, do I need to increase max_tokens here?

generation_kwargs={"max_tokens": 1000, "temperature": 0.5, "top_p": 1},

Last question: how can I make it generate more than 5 questions at a time? Is changing "create 5 multiple choice....." in the prompt enough?
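For example, something like this (made-up values, just to show what I mean):

```python
# made-up values, just to illustrate the two changes I'm asking about
generation_kwargs = {"max_tokens": 2000, "temperature": 0.5, "top_p": 1}

# ...and changing the number in the prompt template, e.g.
# "create 5 multiple choice ..."  ->  "create 10 multiple choice ..."
```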

anakin87 commented 6 months ago