ruecat / ollama-telegram

🦙 Ollama Telegram bot, with advanced configuration
MIT License

[Insert the most recent Uk political news headlines here] #42

Closed Roforum closed 5 months ago

Roforum commented 7 months ago

Hi, I asked for the latest UK political news and got something like this: [Of course! Let's take a look at the latest political news from UK:

[Insert the most recent Uk political news headlines here]

Please provide the specific area you would like me to focus on, and I will be happy to assist you further.]

How and where can I edit/customize the answer, or add options to access the web and search when necessary, etc.?

Very good job and many thanks.

lnrssll commented 6 months ago

Not to speak for the maintainers, but this is likely far out of scope for this project.

LLMs do not, by themselves, access the web as ChatGPT etc. do -- that is a totally separate feature from the model itself, which only "knows" information it was trained on (generally not current information). In general, you'll be best off asking your local ollama models only factual questions whose answers are persistent over time, e.g. "what was [historical event]?", "who was [historical figure]?", "what is the weather typically like in the UK?"

That said, if you would like your ollama models to have access to news specifically, you could feed that information to the model context (as a prompt) yourself. I'd suggest you start by creating an account at one of the many news API providers (e.g. Bing News Search API via Azure, NewsAPI, NYT API) and incorporating the response into your queries (note that this will significantly slow down all of your queries if you are not selective about when it is given as context). You could later incorporate that into your local clone of the ollama-telegram repo.

Here's an example of using one of the news APIs (you can create a free Azure account to get an API key): https://github.com/microsoft/bing-search-sdk-for-python/blob/main/samples/rest/BingWebSearchV7.py
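As a rough, stdlib-only sketch of that first step (the endpoint, header, and response shape follow the Bing News Search v7 docs; the `BING_SEARCH_KEY` environment variable name and the helper names are just my own assumptions), you could fetch headlines and turn them into a prompt string like this:

```python
import json
import os
import urllib.parse
import urllib.request

# Bing News Search v7 endpoint (see the Azure docs)
BING_NEWS_ENDPOINT = "https://api.bing.microsoft.com/v7.0/news/search"


def fetch_headlines(query, api_key, count=5):
    """Return a list of headline strings for the given query."""
    params = urllib.parse.urlencode({"q": query, "count": count, "mkt": "en-GB"})
    req = urllib.request.Request(
        f"{BING_NEWS_ENDPOINT}?{params}",
        headers={"Ocp-Apim-Subscription-Key": api_key},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        data = json.load(resp)
    # Each result in "value" carries its headline in the "name" field
    return [article["name"] for article in data["value"]]


def build_news_context(headlines):
    """Format raw headlines as a system-prompt string for the model."""
    lines = "\n".join(f"- {h}" for h in headlines)
    return f"Recent news headlines:\n{lines}"


if __name__ == "__main__":
    # Assumed env var name -- set it to your own Azure key
    key = os.environ.get("BING_SEARCH_KEY")
    if key:
        news_response = build_news_context(fetch_headlines("UK politics", key))
        print(news_response)
```

The string produced by `build_news_context` is what the snippet below refers to as `news_response`.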

You could incorporate the news response in Python like so:

import ollama
# ...

user_query = {
    'role': 'user',
    'content': "Tell me about the latest uk political news"
}

messages = [
    {
        # news_response holds the text fetched from your news API
        'role': 'system',
        'content': news_response
    },
    user_query
]

ollama_response = ollama.chat(
    model='llama3',
    messages=messages
)

print(ollama_response['message']['content'])

Good luck!