kturung / streamlit_podcast_generator


Problems installing on Mac #1

Open jordilinarespellicer opened 5 months ago

jordilinarespellicer commented 5 months ago

```
$ pip3 install -r requirements.txt
Collecting phidata==2.3.90 (from -r requirements.txt (line 1))
  Using cached phidata-2.3.90-py3-none-any.whl.metadata (11 kB)
Collecting pydantic==2.7.1 (from -r requirements.txt (line 2))
  Using cached pydantic-2.7.1-py3-none-any.whl.metadata (107 kB)
Collecting streamlit==1.29.0 (from -r requirements.txt (line 3))
  Using cached streamlit-1.29.0-py2.py3-none-any.whl.metadata (8.2 kB)
ERROR: Ignored the following versions that require a different python version: 0.55.2 Requires-Python <3.5
ERROR: Could not find a version that satisfies the requirement torch==2.2.0+cu121 (from versions: 1.11.0, 1.12.0, 1.12.1, 1.13.0, 1.13.1, 2.0.0, 2.0.1, 2.1.0, 2.1.1, 2.1.2, 2.2.0, 2.2.1, 2.2.2, 2.3.0)
ERROR: No matching distribution found for torch==2.2.0+cu121
(podcast)
```

kturung commented 5 months ago

I'm not a Mac user, but have you tried changing the torch version in requirements.txt to `torch==2.2.0`?
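The likely reason the pin fails: the `+cu121` local-version suffix marks a CUDA 12.1 build of torch, and PyPI does not publish those wheels for macOS, so only the plain `2.2.0` version is installable there. A minimal sketch of the change, with a hypothetical helper that strips any CUDA suffix from a requirement line:

```python
# Hypothetical helper: strip a CUDA local-version suffix (e.g. "+cu121")
# from a pinned requirement, leaving the plain PyPI version for macOS.
import re

def strip_cuda_suffix(requirement: str) -> str:
    # "+cuNNN" is a PEP 440 local version identifier; remove it if present.
    return re.sub(r"\+cu\d+", "", requirement)

print(strip_cuda_suffix("torch==2.2.0+cu121"))  # → torch==2.2.0
```

Applying this to the `torch` line in requirements.txt (i.e. pinning `torch==2.2.0`) should let pip resolve a wheel on macOS.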

jordilinarespellicer commented 5 months ago

Yes, it worked.

But now, although Ollama is installed (with phi, etc.), I get this after `streamlit run app.py`:

```
ModuleNotFoundError: No module named 'ollama'
Traceback:
File "/Users/jordilinarespellicer/anaconda3/envs/myenve/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
File "/Users/jordilinarespellicer/Projects/streamlit_podcast_generator/app.py", line 10, in <module>
    from phi.llm.ollama import Ollama
File "/Users/jordilinarespellicer/anaconda3/envs/myenve/lib/python3.11/site-packages/phi/llm/ollama/__init__.py", line 1, in <module>
    from phi.llm.ollama.chat import Ollama
File "/Users/jordilinarespellicer/anaconda3/envs/myenve/lib/python3.11/site-packages/phi/llm/ollama/chat.py", line 13, in <module>
    from ollama import Client as OllamaClient
```
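A note on what the traceback shows: the Ollama desktop app and the `ollama` *Python client* are separate things, and phidata's `phi.llm.ollama` module imports the latter, which is not pinned in this project's requirements.txt. A small sketch to check whether the client package is present in the active environment (the fix, assuming this diagnosis is right, is `pip3 install ollama`):

```python
# Sketch: detect whether the `ollama` Python client (imported by
# phi.llm.ollama) is importable, without actually importing it.
import importlib.util

def has_ollama_client() -> bool:
    return importlib.util.find_spec("ollama") is not None

if has_ollama_client():
    print("ollama client installed")
else:
    print("ollama client missing - run: pip3 install ollama")
```

Having the Ollama server running is still a separate requirement; this only covers the missing Python package.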