Foxman13 closed this issue 4 years ago.
Sounds good, but what support are you thinking about? Windows containers, Hyper-V containers, or Docker for Windows? As a first step I think Docker for Windows would be the easiest: if we could enable Hyper-V DDA (Discrete Device Assignment) through Docker and pass the GPU devices through, porting nvidia-docker should be straightforward.
Also, be aware that we will be publishing nvidia-docker 2.0 based on libnvidia-container soon.
Actually, it would be great to at least be able to deploy containers using nvidia-docker from Windows to a remote docker-machine that runs on supported hardware/software. Does that sound possible?
@MrZoidberg Yes it will be supported with nvidia-docker 2.0
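Roughly, the client-side workflow would look like the sketch below (assuming a bash-like shell on the Windows client, a placeholder docker-machine name, and the NVIDIA driver plus nvidia-docker 2.0 already installed on the remote Linux host; the CUDA image tag is just an example):

```
# Point the local Docker CLI at a remote Linux docker-machine
# ("remote-gpu-host" is a placeholder for your own machine name).
eval $(docker-machine env remote-gpu-host)

# nvidia-docker 2.0 selects the GPU runtime per container, so the
# Windows client needs nothing NVIDIA-specific installed locally.
docker run --rm --runtime=nvidia nvidia/cuda:9.0-base nvidia-smi
```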
The reason supercomputers use Linux is that the Windows client license on 10,000 nodes gets pricey! There are also traditionally a bunch of better tools for managing Linux clusters.
Still, from a dev perspective, supporting Windows too would be nice.
I'm looking forward to seeing it.
Any updates on this? Would love to see this happen.
Ping, this would also be quite useful for TensorFlow.
@gunan I was thinking the same thing, though would it even work, since TensorFlow Docker containers have to run in a VM on Windows, right?
I do not think a VM is required, as long as you have the appropriate version of Windows. I am thinking this could be a good solution to the CUDA/cuDNN installation problems the majority of our community is running into, and a quick way to start TF.
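On a supported Linux host the quick start looks roughly like this (just a sketch; the image tag and the one-line GPU check are examples, and the whole point of this issue is getting an equivalent on Windows):

```
# Pull the GPU-enabled TensorFlow image and confirm it can see a GPU
# (requires the NVIDIA driver and nvidia-docker 2.0 on the Linux host).
docker run --rm --runtime=nvidia tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"
```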
Please let me know if you ever figure that out. I'm an artist who taught themselves how to use the CLI specifically so I could mess around with DeepDream and DeepStyle. I figured things out, and I've been working on a series of tutorial articles targeting artists that explain how to use neural networks, so any easier approaches that I could latch on to and spread would be very welcome. Also, I've now got a dedicated Linux desktop, but my laptop is still Windows and it's more powerful, so it's a shame not to use it! (I also can't dual boot it, because it's a Surface Book 2 and the kernel has issues with its built-in GPU throttling 😿)
Any update on this? :)
I'm just wondering: has anyone been working with Microsoft, and does anyone know whether WSL2 brings us any closer to having access to the GPU in Docker on Windows?
The example shows plain Docker running, which is a big improvement over WSL1; however, I don't see how this helps give hardware access to the GPU...
Bump
https://devblogs.microsoft.com/commandline/wsl-2-post-build-faq/
At least they specifically mention the GPU and say more hardware support is high on the list (although whether the GPU is part of that list is left a little vague). That's promising. (for LCOW)
https://docs.microsoft.com/en-us/virtualization/windowscontainers/deploy-containers/gpu-acceleration is also a little interesting, until you see the note saying "For DirectX only", which AFAIK means no CUDA, but maybe OpenGL off-screen rendering? (for WCOW)
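For reference, that doc demonstrates the WCOW side roughly as below; the device interface class GUID is the one quoted there, and process isolation is required. It only surfaces DirectX/DirectML inside the container, not CUDA:

```
# GPU-accelerated Windows container (process isolation), per the linked doc.
# The GUID identifies the GPU device interface class; only DirectX/DirectML
# is exposed to the container, so no CUDA.
docker run --isolation process --device class/5B45201D-F2F2-4F3B-85BB-30FF1F953599 mcr.microsoft.com/windows:1809
```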
For TF, I am actually more interested in having dev Docker containers using Windows containers on Docker for Windows, so that I can build TF in those containers.
Sadly nvidia-docker doesn't work on WSL2... even after risking my machine to move to the Fast track releases. WSL2 runs a VM, whereas WSL1 ran an emulation layer. Not sure which is worse...
@ianferreira Yeah, I'm not a fan of the VM approach, but it's the direction they chose... And today there's no way a GPU would work in that. My only hope is that since they even mention GPU support, maybe they plan on adding something like... I don't know, a GPU proxy feature that would eventually allow GPU/CUDA commands to be passed to a real GPU from their VM *shrugs*. GPU passthrough has not been a realistic alternative in my experience. (It only works with a small subset of cards, drivers, etc., and you lose access to that card from the host, so it's no good for people with one card, like most laptops.)
For Docker for Windows, could this apply? https://techcommunity.microsoft.com/t5/Containers/Bringing-GPU-acceleration-to-Windows-containers/ba-p/393939
It mentions DirectX, but could it be used for CUDA?
@cwilhit, author of the post linked above: Craig, is running CUDA applications on Windows inside a container a possibility? If not, should we expect it to become possible in the future?
@gunan running NVIDIA GPU-dependent workloads in Windows containers or Linux containers on Windows (LCOW) is not something that can be done today. Can you help clarify for me: is the interest in Windows containers with a CUDA-based app, or in running a Linux container on Windows or via WSL, which itself would need to use the GPU?
@rick-man FYI
Thanks for the response, Craig. I am a member of the TensorFlow team. While this issue seems to focus more on Linux, in my case I am interested in running a Windows Docker container that can build and run TensorFlow with GPU support anywhere.
@cwilhit I think being able to run Linux containers in WSL2 with CUDA support would be great. Second would be Windows containers with CUDA support. Most of the Docker momentum is on the Linux side, so the former would be huge. It basically removes the one last thing that macOS has over Windows. Has MSFT considered the reverse, running a Linux kernel with Win32 emulation? Just saying :)
@cwilhit I can say that many companies and a huge research community use GPU-accelerated code that is based on CUDA, not DirectML (such as TensorFlow, Caffe, PyTorch, etc.). We run code on Windows AND Linux, and CUDA is common to both of them on NVIDIA hardware. Ignoring arguments like DirectX vs OpenGL vs OpenCL vs CUDA and Intel vs AMD vs NVIDIA, the state of things today is that a lot of these libraries use CUDA and NVIDIA graphics cards.
We think it would be a huge deal to be able to say "We can run CUDA-accelerated code in a Docker container on Windows". Most of us prefer the LCOW (Linux Containers on Windows) approach, for which WSL2 opens a whole new slew of possibilities, but even being able to run CUDA in WCOW (Windows Containers on Windows) would be a big deal, because then we could set up a complicated environment in a container and have it just work!
Thanks for replying!
Feature request+1
Hi everyone, I am a newcomer evaluating NVIDIA DIGITS using nvidia-docker under Ubuntu 18.04.3, but the Ubuntu 18.04.3 is running on the Windows Subsystem for Linux on Windows 10. Could anyone share whether Ubuntu 18.04.3 running on WSL on Windows 10 supports installing Docker and nvidia-docker to run NVIDIA DIGITS for deep learning? Thanks, Francis
@FrancisLeung No, it cannot. There is no GPU access in WSL1 or WSL2.
Hi guys, are there any updates on this issue? I'm using Win10 Pro (with Hyper-V enabled) and have Docker for Windows installed. Is there any way to use nvidia-docker?
Thanks!
Hello, I came here to cry as it looks like it is not available yet.
https://devblogs.microsoft.com/directx/directx-heart-linux/
NVIDIA CUDA is now officially supported on WSL.
Stay tuned, and please watch this space for more updates on availability.
I think there are two aspects of this bug:
1. nvidia-docker running on WSL, which seems to be made possible with recent updates.
2. nvidia-docker on Windows without WSL; there seem to be no updates on this.
I am trying to enable DeepStack AI on WSL2 with kernel 4.19.121-microsoft-standard, and I am getting:
```
docker: Error response from daemon: OCI runtime create failed: container_linux.go:349: starting container process caused "process_linux.go:449: container init caused \"process_linux.go:432: running prestart hook 1 caused \\\"error running hook: exit status 1, stdout: , stderr: nvidia-container-cli: initialization error: driver error: failed to process request\\\\n\\\"\"": unknown.
```
Does anyone have any clue how to get WSL2 and CUDA working?
Just some more "news" I found. Statement on the Nvidia website: https://developer.nvidia.com/cuda/wsl
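For anyone trying this, the general recipe from NVIDIA's CUDA-on-WSL preview docs looks roughly like the sketch below, run inside the WSL2 Ubuntu distro. It assumes the Windows-side preview driver is already installed on the host; the repo URLs are the standard nvidia-docker ones, and the container-runtime packages may still need the pre-release mentioned below:

```
# Inside the WSL2 Ubuntu distro (the CUDA preview driver lives on the
# Windows host, so do NOT install a Linux NVIDIA driver here).

# 1. Docker Engine inside WSL2 (not Docker Desktop)
curl -fsSL https://get.docker.com | sudo sh

# 2. NVIDIA container runtime packages
distribution=$(. /etc/os-release; echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | \
    sudo tee /etc/apt/sources.list.d/nvidia-docker.list
sudo apt-get update && sudo apt-get install -y nvidia-docker2
sudo service docker restart
```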
Yes. And if you have access to WSL2, support has been added for it in the latest RC pre-release of libnvidia-container (which is what nvidia-docker relies on under the hood to enable GPU support inside containers).
https://github.com/NVIDIA/libnvidia-container/releases/tag/v1.2.0-rc.1
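Once those RC packages are installed inside the WSL2 distro, a quick smoke test (using the CUDA n-body sample container referenced in NVIDIA's CUDA-on-WSL guide) looks like:

```
# Should print GPU-accelerated n-body benchmark results if the stack is wired up.
docker run --rm --gpus all nvcr.io/nvidia/k8s/cuda-sample:nbody nbody -gpu -benchmark
```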
The feature is up for public preview. Closing this issue. Please open new issues for specific problems you face.
I would like to help add support for Docker for Windows. Filing an issue to track my work.