souzatharsis / podcastfy

An Open Source alternative to NotebookLM's podcast feature: Transforming Multimodal Content into Captivating Multilingual Audio Conversations with GenAI
https://www.podcastfy.ai

Error when installing #50

Open GodAtum360 opened 5 days ago

GodAtum360 commented 5 days ago

When trying to install (pip install podcastfy) I get the error:

     ../meson.build:1:0: ERROR: Unknown compiler(s): [['cc'], ['gcc'], ['clang'], ['nvc'], ['pgcc'], ['icc'], ['icx']]
      The following exception(s) were encountered:
      Running `cc --version` gave "[Errno 2] No such file or directory: 'cc'"
      Running `gcc --version` gave "[Errno 2] No such file or directory: 'gcc'"
      Running `clang --version` gave "[Errno 2] No such file or directory: 'clang'"
      Running `nvc --version` gave "[Errno 2] No such file or directory: 'nvc'"
      Running `pgcc --version` gave "[Errno 2] No such file or directory: 'pgcc'"
      Running `icc --version` gave "[Errno 2] No such file or directory: 'icc'"
      Running `icx --version` gave "[Errno 2] No such file or directory: 'icx'"

I am running Python 3.13 on Ubuntu 20.

souzatharsis commented 4 days ago

That's interesting... This error message suggests that Meson, a build system, is unable to find a C compiler on your system.

Have you tried:

sudo apt update
sudo apt install build-essential
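If the install succeeds, a quick way to confirm Meson will now find a compiler is to check PATH for the same candidates it probes (a small diagnostic sketch, not part of podcastfy):

```python
import shutil

# Meson probes these C compilers in order (per the error above);
# shutil.which reports which, if any, are on PATH.
for compiler in ["cc", "gcc", "clang", "nvc", "pgcc", "icc", "icx"]:
    path = shutil.which(compiler)
    print(f"{compiler}: {path or 'not found'}")
```

After `build-essential` is installed, at least `cc` and `gcc` should resolve to a path.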

GodAtum360 commented 4 days ago

Thanks, I had to run these:

sudo apt install build-essential
sudo apt install python-dev
sudo apt-get install python3.13-dev

Now when running python -m podcastfy.client --help, I get the error:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/christopher/env/lib/python3.13/site-packages/podcastfy/client.py", line 12, in <module>
    from podcastfy.content_generator import ContentGenerator
  File "/home/christopher/env/lib/python3.13/site-packages/podcastfy/content_generator.py", line 13, in <module>
    from langchain_google_genai import ChatGoogleGenerativeAI
  File "/home/christopher/env/lib/python3.13/site-packages/langchain_google_genai/__init__.py", line 58, in <module>
    from langchain_google_genai._enums import HarmBlockThreshold, HarmCategory
  File "/home/christopher/env/lib/python3.13/site-packages/langchain_google_genai/_enums.py", line 1, in <module>
    import google.ai.generativelanguage_v1beta as genai
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/__init__.py", line 21, in <module>
    from .services.cache_service import CacheServiceAsyncClient, CacheServiceClient
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/__init__.py", line 16, in <module>
    from .async_client import CacheServiceAsyncClient
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/async_client.py", line 50, in <module>
    from google.ai.generativelanguage_v1beta.services.cache_service import pagers
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/pagers.py", line 41, in <module>
    from google.ai.generativelanguage_v1beta.types import cache_service, cached_content
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/types/__init__.py", line 47, in <module>
    from .discuss_service import (
    ...<7 lines>...
    )
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/types/discuss_service.py", line 22, in <module>
    from google.ai.generativelanguage_v1beta.types import citation, safety
  File "/home/christopher/env/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/types/safety.py", line 224, in <module>
    class SafetySetting(proto.Message):
    ...<48 lines>...
        )
  File "/home/christopher/env/lib/python3.13/site-packages/proto/message.py", line 279, in __new__
    file_info.generate_file_pb(new_class=cls, fallback_salt=full_name)
  File "/home/christopher/env/lib/python3.13/site-packages/proto/_file_info.py", line 104, in generate_file_pb
    pool.Add(self.descriptor)
    ~~~~~~~~^^^^^^^^^^^^^^^^^
TypeError: Couldn't build proto file into descriptor pool: duplicate symbol 'google.ai.generativelanguage.v1beta.__firstlineno__'
souzatharsis commented 3 days ago

This error suggests a conflict in the Google AI Generative Language library. This is likely due to incompatible versions of dependencies or a problem with the installation.

Please try

pip install --upgrade google-ai-generativelanguage langchain-google-genai podcastfy
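If that doesn't resolve it, it would help to see which versions are actually installed. A small sketch using only the standard library (proto-plus is the package whose code raises the error in your traceback):

```python
from importlib import metadata

# Report the installed versions of the packages involved in the
# import chain from the traceback above.
for pkg in ("proto-plus", "protobuf", "google-ai-generativelanguage"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```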

GodAtum360 commented 3 days ago

Unfortunately, I still get the same error.

adambwells commented 1 day ago

I also get what looks like an identical error, running Python 3.13 on a Mac (with Sequoia 15.0.1, build 24A348).

$ python -m podcastfy.client --help
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/podcastfy/client.py", line 14, in <module>
    from podcastfy.content_generator import ContentGenerator
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/podcastfy/content_generator.py", line 12, in <module>
    from langchain_google_genai import ChatGoogleGenerativeAI
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/langchain_google_genai/__init__.py", line 58, in <module>
    from langchain_google_genai._enums import HarmBlockThreshold, HarmCategory
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/langchain_google_genai/_enums.py", line 1, in <module>
    import google.ai.generativelanguage_v1beta as genai
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/__init__.py", line 21, in <module>
    from .services.cache_service import CacheServiceAsyncClient, CacheServiceClient
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/__init__.py", line 16, in <module>
    from .async_client import CacheServiceAsyncClient
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/async_client.py", line 50, in <module>
    from google.ai.generativelanguage_v1beta.services.cache_service import pagers
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/pagers.py", line 41, in <module>
    from google.ai.generativelanguage_v1beta.types import cache_service, cached_content
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/types/__init__.py", line 47, in <module>
    from .discuss_service import (
    ...<7 lines>...
    )
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/types/discuss_service.py", line 22, in <module>
    from google.ai.generativelanguage_v1beta.types import citation, safety
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/google/ai/generativelanguage_v1beta/types/safety.py", line 224, in <module>
    class SafetySetting(proto.Message):
    ...<48 lines>...
        )
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/proto/message.py", line 279, in __new__
    file_info.generate_file_pb(new_class=cls, fallback_salt=full_name)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adam/Documents/Development/podcastfy/lib/python3.13/site-packages/proto/_file_info.py", line 104, in generate_file_pb
    pool.Add(self.descriptor)
    ~~~~~~~~^^^^^^^^^^^^^^^^^
TypeError: Couldn't build proto file into descriptor pool: duplicate symbol 'google.ai.generativelanguage.v1beta.__firstlineno__'

I tried upgrading the two modules you mentioned above, but it said that everything was up to date, and the error persists.

souzatharsis commented 1 day ago

Thanks for sharing your experience. This further motivates the release of a Docker image. There is already an open issue for it, and we should be able to release soon.
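For reference, the duplicate symbol in both tracebacks is `__firstlineno__`, an attribute that Python 3.13 started adding to every class, which is likely why this only surfaces on 3.13. A minimal sketch showing the new attribute:

```python
import sys


class Example:
    pass


# Python 3.13+ records the class body's starting line number on the
# class itself; earlier interpreters define no such attribute.
print(sys.version_info[:2], hasattr(Example, "__firstlineno__"))
```

So pinning the interpreter to 3.12, or upgrading the proto-related packages to releases that account for the new attribute, are the likely workarounds until the Docker image lands.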