luanfujun / deep-photo-styletransfer

Code and data for paper "Deep Photo Style Transfer": https://arxiv.org/abs/1703.07511

Is it possible to run this code without a GPU? #13

Closed neslinesli93 closed 7 years ago

neslinesli93 commented 7 years ago

Hi, thanks for the project - it's mind-blowing, to say the least!

I would like to run this script inside a VPS with no GPU and I was wondering how much of the code is tied to CUDA...

zsavajji commented 7 years ago

Seconding this; it would be nice to run it on one or more VPSs.

neslinesli93 commented 7 years ago

After digging around in the code, it looks like it should be possible to run it without CUDA support by simply removing the CUDA dependencies and all the :cuda() calls, and by converting the CUDA tensors to float tensors...
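
Something along these lines (an untested CPU-only sketch; the tensor and layer below are just placeholders, not code from this repo). Besides the casts, the require 'cutorch' / require 'cunn' lines would have to go, and loadcaffe would need the 'nn' backend instead of 'cudnn':

```lua
-- Untested sketch: on the CPU, every :cuda() cast becomes :float().
-- The tensor and layer here are placeholders, not code from the repo.
require 'torch'
require 'nn'

-- was: local img = torch.rand(3, 256, 256):cuda()
local img = torch.rand(3, 256, 256):float()

-- was: local conv = nn.SpatialConvolution(3, 16, 3, 3):cuda()
local conv = nn.SpatialConvolution(3, 16, 3, 3):float()

print(conv:forward(img):size())
```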

ProGamerGov commented 7 years ago

@Zsavajji @neslinesli93 The code used in deep-photo-styletransfer is based on an old version of Neural-Style's code from before Dec 2016. Link to the old code: https://github.com/jcjohnson/neural-style/blob/abd4b418ab8eff21537ec49090d902c2e0b3dc2d/neural_style.lua

Neural-Style's current code uses a setup_gpu function and appends :type(dtype) where one would add :cuda() in deep-photo-styletransfer's code.

The current version of Neural-Style's code can be found here: https://github.com/jcjohnson/neural-style/blob/master/neural_style.lua
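
Roughly, the dtype pattern looks like this (a simplified sketch, not the actual neural-style code):

```lua
-- Simplified sketch of the dtype idea, not the actual neural-style code:
-- one dtype string picks CPU or GPU, and :type(dtype) replaces :cuda().
require 'torch'
require 'nn'

local use_gpu = false  -- placeholder flag; neural-style derives this from its -gpu option
local dtype = use_gpu and 'torch.CudaTensor' or 'torch.FloatTensor'

local img = torch.rand(3, 256, 256):type(dtype)
local net = nn.Sequential()
net:add(nn.SpatialConvolution(3, 16, 3, 3))
net:add(nn.ReLU())
net:type(dtype)

print(net:forward(img):size())
```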

luanfujun commented 7 years ago

Yes, the Torch implementation is based on that old version of the Neural-Style code. But it's not easy to move everything to the CPU (check out cuda_utils.cu, which contains our core contribution).

ProGamerGov commented 7 years ago

@luanfujun Is cuda_utils.cu used for the automated image segmentation process? And if so, then does neuralstyle_seg.lua require cuda_utils.cu?

luanfujun commented 7 years ago

cuda_utils.cu is for the matting Laplacian regularization and reconstruction. It is used in deepmatting_seg.lua.

neuralstyle_seg.lua actually doesn't require it.
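
For reference, the term those kernels implement is the per-channel photorealism regularization from the paper (a sketch of Eq. 2 and its gradient, in the paper's notation):

```latex
L_m = \sum_{c=1}^{3} V_c[O]^\top \, \mathcal{M}_I \, V_c[O],
\qquad
\frac{\partial L_m}{\partial V_c[O]} = 2 \, \mathcal{M}_I \, V_c[O]
```

where \mathcal{M}_I is the matting Laplacian (Levin et al.) of the input image I and V_c[O] is channel c of the output image O, vectorized. Roughly speaking, the sparse matrix-vector products behind this term and its gradient are what cuda_utils.cu evaluates on the GPU.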

ProGamerGov commented 7 years ago

@luanfujun I removed Matio and cuda_utils.cu for neuralstyle_seg.lua in this pull request: https://github.com/luanfujun/deep-photo-styletransfer/pull/25. People should be able to run that script now without having to deal with Matio/Matlab/Octave or cuda_utils.cu.

luanfujun commented 7 years ago

But they have to deal with Matio/Matlab/Octave and cuda_utils.cu when it comes to deepmatting_seg.lua, which is a necessary final step of this algorithm.

Actually, I could merge the two Lua files into one that first generates the neural-style intermediate result and then regularizes it using the matting Laplacian. But the benefit is not obvious, so I chose not to.

themightyoarfish commented 7 years ago

It's not clear to me why someone would want to run this on a CPU. It already takes a long time on a GPU; is there any hope that running all of this on a CPU won't take days for one image?

ad34 commented 7 years ago

@themightyoarfish Maybe for people without a CUDA-capable device. But agreed, it will take too much time and too many resources on the CPU.

dieron commented 7 years ago

I'd like to try this solution, but my test server doesn't have an NVIDIA card, so the ability to run this without a CUDA device makes sense.

themightyoarfish commented 7 years ago

Well sure, but it will take days, so I'm not sure what the point is.

dieron commented 7 years ago

I'm a newbie to all this CUDA stuff. I understand that a GPU is the right choice for such tasks, but in my case I'm just trying to see whether I can run this project and get acceptable results.

I could try it on a small image to see if it works for me before trying "big" images with a GPU.

If it's really possible and doesn't "cost" too much, I'd like to be able to build or run it without GPU support.

I'm very impressed by the results and I really want to try it myself :-)

maxvonhippel commented 7 years ago

I have a very powerful MacBook Pro 15" Retina with a decent Intel Iris Pro graphics card, and I've installed CUDA and all dependencies for this project without issue, but I can't get the project to either a) compile with Caffe built from source as CPU-only, or b) compile in such a way that the Python script needed after the Matlab step works. I also second this feature request. I would love to experiment with this and contribute to the project, but I've been working off and on for 4 days now on getting this to run on my MacBook and have basically hit a complete impasse. (And as a college student studying pure maths, I can't really afford to just buy a second, NVIDIA-GPU, CUDA-enabled laptop ...)

themightyoarfish commented 7 years ago

How did you even install CUDA without a CUDA device? I wasn't aware this is possible. Anyway, I can only speak from experience with neural-style: running it on a CPU at any kind of resolution takes more hours than I cared to wait, and the deep-photo step takes even longer. So I recommend you familiarise yourself with AWS a bit and rent a P2 machine you can ssh into. Spot instances are as cheap as $0.10 per hour, and I also have an AMI ready to go where most of the stuff is already installed. Also, you should check out https://github.com/martinbenson/deep-photo-styletransfer which removes the Matlab dependency and fixes some other issues.

maxvonhippel commented 7 years ago

@themightyoarfish CUDA installs without issue on my MacBook Pro. I think this is because you can do a lot of CUDA work without actually having a CUDA GPU. For example, I was able to compile Caffe from source to run on the CPU by trivially modifying the Makefile. My impression is that Caffe is a CUDA-dependent application by default.
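
For anyone trying the same thing: the change was essentially flipping Caffe's CPU-only switch. The exact line may differ between Caffe versions, but in my copy it was the CPU_ONLY flag in Makefile.config:

```makefile
# Makefile.config (Caffe) -- uncomment this line to build without CUDA
CPU_ONLY := 1
```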

Anyway, coming into this my hope was that I would figure out a way to use my Intel Iris Pro 1536 MB GPU in place of an NVIDIA GPU with CUDA, but it looks like that is not a possibility: either I use an NVIDIA CUDA-enabled GPU, or I use the CPU. And from what you and others have said, the latter option sounds like a bad one. I can definitely understand your point about time. I used to use a CUDA tower for point-cloud analysis of FARO scans for edge-detection AR work, and even on a powerful tower with a CUDA-enabled NVIDIA GPU it took a long time. I'll give AWS a shot; given your description, it sounds like that's my best option. Thanks!

tarvos21 commented 7 years ago

Hi @maxvonhippel, I'm in almost the same situation as you: I'm on a 15-inch MacBook Pro with Iris Pro graphics as well as a Radeon M370 graphics card, but neither is an NVIDIA GPU. Is it possible to run this repo on my laptop?

maxvonhippel commented 7 years ago

@tarvos21 I don't believe so. NVIDIA is starting to release (beta) drivers for more Macintosh GPUs, although I don't see anything about Intel Iris Pro or Radeon in there, and I'm not sure that NVIDIA drivers, even if they existed, would necessarily be enough to give you/us CUDA. NVIDIA did just release this crazy new TITAN CUDA-enabled GPU with beta support for macOS, and I imagine they're hoping a lot of us who are displeased with Mac Pro development (or the lack thereof) will buy the TITAN for external use, but IMO that's a huge pain to set up. Possibly BisonBox or some such company will build a ready-made product to that effect, but it will still be very expensive. I think @themightyoarfish is definitely right that, at this point in time, AWS is going to be our best bet. I'm trying to find a professor at my university with a CUDA-enabled NVIDIA machine I can use over ssh this summer in return for contributions to some project or other, but yeah, at the end of the day I'm going to end up using someone else's machine, whether it's on AWS or in my school's CS department or wherever.

Edit: @tarvos21 I managed to convince a biology professor to let me borrow an unbelievably powerful, crazy monster of a laptop (MSI Ghost, Intel Core i7, 1.5TB internal storage, 16GB RAM, NVIDIA GeForce GTX 970M). Anyway, if you have some specific test you'd really like to run but still don't have access to a suitable machine, feel free to shoot me an email, and if it doesn't take too much time I'll run it for you!

tarvos21 commented 7 years ago

@maxvonhippel, thanks a lot for your advice. I think trying to run it on an AWS instance with a decent GPU is my best choice.

nitish11 commented 6 years ago

I am working on an AWS Deep Learning instance without a GPU.

Is this code compatible only with a GPU, or can it be used on a CPU? Let me know the feasibility and challenges; I can raise a PR for this.