Closed adriens closed 4 months ago
:pray: @mikeesto for the code suggestion :smile_cat:
This would close #29.
Worth mentioning that there are a few ways to do this. For example, you might also add something like:
## Local Ollama Setup
You need to have a local Ollama server running to be able to continue. To do this:
- Download: https://ollama.com/
- Run an LLM: https://ollama.com/library
- Example: `ollama run llama2`
- Example: `ollama run llama2:70b`
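Once the steps above are done, it can help to verify that the server is actually reachable before calling the client library. Here is a minimal sketch of such a check; note that `ollama_is_running` is a hypothetical helper written for this example, not part of `ollama-python`, and it assumes the default server address `http://localhost:11434`:

```python
# Hypothetical helper (not part of ollama-python): check whether a local
# Ollama server is reachable before using the client library.
import urllib.request
import urllib.error


def ollama_is_running(host: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers on the given address."""
    try:
        with urllib.request.urlopen(host, timeout=2) as resp:
            # A running Ollama server replies with HTTP 200 on its root URL.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: no server listening.
        return False


if __name__ == "__main__":
    if ollama_is_running():
        print("Ollama server detected -- ready to use ollama-python")
    else:
        print("Start Ollama first, e.g. `ollama run llama2`")
```

If the check fails, the `ollama run ...` commands above will start the server on the default port.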
:pray: @connor-makowski
Big thanks @adriens
That was a very modest contribution, but it's always very cool to get feedback on projects you really enjoy working with :star_struck: @haesleinhuepf
Dear maintainers,
this is a very useful documentation update. Consider merging it. 😉
:partying_face:
Referencing this to: https://github.com/ollama/ollama-python/pull/155 as an extension for adding clarity to the getting started process.
:grey_question: About

While trying to play with `ollama-python` in a brand new environment, I found out that the current steps of the README do not take into account that `ollama` must be up and running (i.e. that this is not `ollama-python`'s role).

:dart: Objective

The purpose of this PR is to make the Getting Started even easier, so people can just copy/paste and enjoy :smile: