Open Hkllopp opened 1 month ago
ok, it could be an idea, but please provide a use case for the system prompt. The output is already in JSON format.
Well, another example is that users who want to use Groq with llama3-70b need to mention the word JSON in the system prompt message; it's mandatory.
Even if the output format is JSON?
Yes, according to the Groq playground, the system message should contain the word JSON.
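To make the constraint concrete, here is a minimal sketch of a chat payload for Groq's OpenAI-compatible chat-completions schema (the model name, function name, and local validation check are assumptions for illustration, not part of ScrapegraphAI). Groq's JSON mode rejects requests whose messages never contain the word "JSON", so the check is mirrored locally:

```python
def build_json_mode_payload(user_prompt: str) -> dict:
    """Build a chat-completions payload for Groq's JSON mode (sketch)."""
    payload = {
        "model": "llama3-70b-8192",
        "response_format": {"type": "json_object"},
        "messages": [
            # The word "JSON" must appear in a message, typically the system one.
            {"role": "system",
             "content": "You are a scraper. Reply only with valid JSON."},
            {"role": "user", "content": user_prompt},
        ],
    }
    # Fail fast locally, mirroring the server-side requirement.
    if not any("JSON" in m["content"] for m in payload["messages"]):
        raise ValueError('JSON mode requires the word "JSON" in a message')
    return payload
```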
give me an example please
OpenAI models use the message parameter for the prompt. ScrapegraphAI also uses this parameter to link to the prompt argument on scraper invocation. However, sometimes when using OpenAI models, we need to provide multiple prompts to better guide the response (as in this article and this documentation).
Is it possible to replace the standard ScrapegraphAI prompt by providing a message argument in the graph_config?
Example:
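A hypothetical sketch of what such a config could look like. Note that the `message` key does not exist in ScrapegraphAI today; it is the parameter this issue proposes, and the model name and other values are illustrative assumptions:

```python
# Proposed (not yet existing) graph_config with a custom system message
# that would replace ScrapegraphAI's built-in prompt.
graph_config = {
    "llm": {
        "model": "groq/llama3-70b-8192",  # model identifier is an assumption
        "api_key": "GROQ_API_KEY",
        "temperature": 0,
    },
    # Proposed "message" key: custom system prompt overriding the default.
    # It contains the word "JSON", as required by Groq's JSON mode.
    "message": "You are a precise scraper. Always answer in valid JSON.",
}
```

With this, the library could prepend `graph_config["message"]` as the system message instead of its default prompt.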