mozilla / DeepSpeech

DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Raspberry Pi 4 to high-power GPU servers.
Mozilla Public License 2.0
25.16k stars 3.95k forks

Mention CUDA dependency in README #1798

Closed nh2 closed 5 years ago

nh2 commented 5 years ago

The current README says:

$ pip3 install --upgrade deepspeech-gpu

In both cases, it should take care of installing all the required dependencies.

This isn't the case: if I try it, I get ImportError: libcusolver.so.9.0: cannot open shared object file: No such file or directory, as have others.

The README should mention that CUDA (and which version, and which sub-projects) is a dependency.
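One way to surface this prerequisite early would be a pre-flight check. As a sketch, ctypes.util.find_library can ask the dynamic linker whether the CUDA shared libraries are visible before the wheel tries to load them; the library names below are guessed from the ImportError above, not an official list of what the wheel links against:

```python
from ctypes.util import find_library

# CUDA libraries the deepspeech-gpu wheel appears to need, judging from
# the ImportError above; this list is an assumption, not an official spec.
CUDA_LIBS = ["cusolver", "cudart", "cublas"]

def missing_cuda_libs(names=CUDA_LIBS):
    """Return the subset of libraries the dynamic linker cannot find."""
    return [name for name in names if find_library(name) is None]

if __name__ == "__main__":
    missing = missing_cuda_libs()
    if missing:
        print("Missing CUDA libraries: %s -- install the CUDA toolkit "
              "before running deepspeech-gpu" % ", ".join(missing))
    else:
        print("CUDA runtime libraries found")
```

On a machine without the CUDA toolkit this prints which libraries are missing instead of failing later with a bare ImportError.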

lissyx commented 5 years ago

@nh2 We already mention CUDA-capable devices; it felt clear to us that this implied CUDA being installed, but obviously that's not the case, so any PR or feedback on pain points to improve the wording would be welcome :-)

lissyx commented 5 years ago

Though, technically, CUDA is not a dependency we can install; it really depends on the user's setup. The dependencies mentioned in your quote are related to the Python wheel.

nh2 commented 5 years ago

We already mention CUDA-capable devices, it felt clear to us that this implied CUDA to be installed

You mention it only in the training section, which people who just want to use the pretrained model will not read.

And even there it only mentions CUDA devices, not which parts of the CUDA SDK are required, or which version (e.g. my current understanding is that even with a full CUDA 8 installation it won't work, because CUDA version 9 is required).

Though, technically, CUDA is not a dependency we can install

That is fine: as long as it's documented what the user must provide as prerequisites, they'll likely do it. But right now the README suggests that it truly takes care of installing all the required dependencies, and specifically so under the deepspeech-gpu install, which is quite misleading and makes for a bad first experience.

lissyx commented 5 years ago

I'm not denying it; I'm open to any suggestion on how to reformulate it, or where to write that.

lissyx commented 5 years ago

Right, we obviously need better docs there: the dependencies are the same as upstream TensorFlow's, CUDA 9.0 and CuDNN v7.2. Though I'm not sure how or where to document that in an efficient manner.

lissyx commented 5 years ago

@nh2 Even smarter than just documenting it, I was thinking maybe we could catch the ImportError and other errors in the Python / NodeJS code and expose a better error message.

I'm still not sure where would be the easiest place to find that in the documentation, so I'm open to your feedback.
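A minimal sketch of what catching the ImportError could look like on the Python side. The module name deepspeech._impl is hypothetical (the wheel's real internal layout may differ), and the version numbers in the message are taken from the comment above:

```python
def load_native_impl():
    """Import the native module, translating a missing-CUDA ImportError
    into an actionable message. Sketch only: deepspeech._impl is a
    hypothetical name for the wheel's native extension module."""
    try:
        import deepspeech._impl as impl  # hypothetical native module
    except ImportError as e:
        # Heuristic: the CUDA runtime libraries show up by name in the
        # dynamic loader's error message (e.g. libcusolver.so.9.0).
        if "libcusolver" in str(e) or "libcudart" in str(e):
            raise ImportError(
                "deepspeech-gpu requires the CUDA toolkit (CUDA 9.0 and "
                "CuDNN v7.2, same as upstream TensorFlow). Install it and "
                "make sure the libraries are on LD_LIBRARY_PATH."
            ) from e
        raise
    return impl
```

Unrelated import failures are re-raised unchanged, so the wrapper only rewords the missing-CUDA case.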

nh2 commented 5 years ago

@lissyx This looks great, thank you!

lock[bot] commented 5 years ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.