Open ic-mw opened 1 week ago
edit: I'm sorry, the real problem seems to be gruut
```
Because no versions of coqui-tts match >0.24.1,<0.25.0
 and coqui-tts (0.24.1) depends on gruut[de,es,fr] (2.2.3), coqui-tts (>=0.24.1,<0.25.0) requires gruut[de,es,fr] (2.2.3). (1)
So, because gruut[de,es,fr] (2.2.3) depends on networkx (>=2.5.0,<3.0.0), coqui-tts (>=0.24.1,<0.25.0) requires networkx (>=2.5.0,<3.0.0).
```
Yes: https://github.com/rhasspy/gruut/blob/e15b695ec0e5edd6352fd8f7647c449ed5f4d3ec/requirements.txt#L6
I've already opened https://github.com/rhasspy/gruut/pull/48 there to add NumPy 2 support and can extend it to networkx. I hope it gets merged.
@ic-mw gruut 2.4.0 was released and removed the upper bounds for numpy and networkx. However, coqui-tts 0.24.1 still pins the old gruut version. I opened #56 to update it, but a few dependencies still have remaining issues with NumPy 2 support. I'll wait a bit longer before merging to see whether those get resolved soon; there are a few other updates I want to make before the next coqui-tts release anyway. In the meantime, feel free to install from that branch.
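For anyone who wants to try the unreleased fix before it lands on PyPI, pip (and Poetry) can install directly from a Git ref. A hedged sketch of a `requirements.txt` line, assuming the coqui-tts fork lives at `idiap/coqui-ai-TTS` and that GitHub's `refs/pull/<N>/head` convention is used to point at the PR branch (verify the repo URL and ref before relying on this):

```
# Hypothetical: install coqui-tts from the head of PR #56 instead of PyPI
coqui-tts @ git+https://github.com/idiap/coqui-ai-TTS@refs/pull/56/head
```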
Excellent, thank you. We'll try out the latest changes.
Feature Description
I am working on a project that uses llama-index-core, but it depends on a version of networkx that is incompatible with coqui-tts.
```
and llama-index-core (0.10.51) depends on networkx (>=3.0), llama-index-core (>=0.10.37.post1,<0.11.0) requires networkx (>=3.0).
And because llama-index-llms-azure-openai (>=0.1.5,<0.2.0) requires networkx (>=3.0) or llama-index-core (>=0.10.37.post1,<0.11.0) (5), llama-index-llms-azure-openai (>=0.1.5,<0.2.0) requires networkx (>=3.0)
And because coqui-tts (>=0.24.1,<0.25.0) requires networkx (>=2.5.0,<3.0.0) (1),
```
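The conflict can be checked mechanically: llama-index-core requires networkx `>=3.0`, while the gruut pin forces `>=2.5.0,<3.0.0`, and no release can satisfy both. A minimal sketch using the `packaging` library (the same specifier logic pip uses; the list of candidate versions is just a sample of real networkx releases):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

coqui_req = SpecifierSet(">=2.5.0,<3.0.0")  # networkx pin from gruut via coqui-tts
llama_req = SpecifierSet(">=3.0")           # networkx pin from llama-index-core

# Check a spread of networkx releases against both specifier sets.
candidates = ["2.5", "2.6.3", "2.8.8", "3.0", "3.1", "3.3"]
both = [v for v in candidates if Version(v) in coqui_req and Version(v) in llama_req]
print(both)  # [] -> the intersection is empty, hence the resolver error
```

Any version below 3.0 fails the llama-index-core requirement, and 3.0 or above fails the gruut cap, so the resolver has no valid choice.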
Solution
Could you please consider supporting the current (3.x) versions of networkx?
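Concretely, the ask is to widen the upper bound. Since gruut 2.4.0 dropped its networkx cap, a sketch of what a relaxed pin in coqui-tts's dependency metadata could look like (the table name and exact bounds here are illustrative assumptions, not the project's actual `pyproject.toml`):

```toml
[tool.poetry.dependencies]
# Hypothetical relaxation: accept both networkx 2.x and 3.x
networkx = ">=2.5.0,<4"
gruut = { version = ">=2.4.0", extras = ["de", "es", "fr"] }
```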