dnouri / cuda-convnet

My fork of Alex Krizhevsky's cuda-convnet from 2013 where I added dropout, among other features.
http://code.google.com/p/cuda-convnet/

Error when I tried this code on Linux... #6

Closed teddykspark closed 10 years ago

teddykspark commented 10 years ago

When I replaced the code in layer.py as shown below...

    dic, name = self.dic, self.dic['name']
    dic['dropout'] = 0.0
    if name in mcp.sections():
        dic['dropout'] = mcp.safe_get_float(name, 'dropout', default=0.0)

it showed an error message saying that 'name' in dic is not defined. Well, I changed the code in layer.cu, layer.cuh, convert.cu, and layer.py after comparing the files with the original cuda-convnet code.
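(For context: the snippet assumes that dic['name'] has already been filled in by the layer parser before it runs. Below is a simplified sketch of that assumption, not the exact cuda-convnet source; the class and method names are only illustrative.)

    # Simplified, illustrative sketch -- not the actual cuda-convnet source.
    # The layer parser builds self.dic from the layer definition file, and
    # the layer's section name is stored as dic['name'] before the dropout
    # option is read from the layer-params config (mcp).
    class LayerParser(object):
        def __init__(self, layer_name):
            self.dic = {'name': layer_name}

        def parse_dropout(self, mcp):
            dic, name = self.dic, self.dic['name']
            dic['dropout'] = 0.0
            # Only layers that have a section in the layer-params file
            # get a non-default dropout value.
            if name in mcp.sections():
                dic['dropout'] = mcp.safe_get_float(name, 'dropout', default=0.0)
            return dic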

How can I solve this issue? Please give me some tips.

dnouri commented 10 years ago

Sorry, no idea what you're trying to do. You replaced the code? How and why?

teddykspark commented 10 years ago

I work with CUDA 4.0 and cuda-convnet, which is why your modified cuda-convnet project can't be built on my machine. It seems that your project is based on CUDA 6.0, so I just tried to port the dropout-related code into my own code. Unfortunately, I haven't been able to upgrade from CUDA 4.0 to 6.0 yet.

In fact, there is no 'name' key in self.dic when I hit the error. Where is the 'name' key supposed to be set?

Is there any solution for me? Please give me a tip for this case. Thanks.

dnouri commented 10 years ago

Aha! Well then try to use an older version from before the project was upgraded to use CUDA 5: d97cf372492f74c119cc003129b2f4a396ede878
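Assuming you cloned this fork with git, checking out that commit should get you the older version:

    git checkout d97cf372492f74c119cc003129b2f4a396ede878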

teddykspark commented 10 years ago

Oh, I solved it with your hints, thanks. Well, I have one more question about where to put dropout in the layer params.

In fact, I use conv, pool, and fc layers in my net. Which layers' params should have dropout for the best results? Only the fully connected layers? Actually, I use 3 fc layers. Please give me a hint for this. Thanks.

dnouri commented 10 years ago

The README links to two papers which I suggest you take a look at. And yes, people will usually apply dropout to fully connected layers. A good method seems to be to make your net learn well and overfit first, and then apply dropout.
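Since dropout in this fork is read per layer section from the layer-params file (as in the snippet you posted), a rough example would be to add a dropout line to your fc sections. The section name and values below are only placeholders for your own settings:

    [fc1]
    epsW=0.001
    epsB=0.002
    momW=0.9
    momB=0.9
    wc=0.004
    dropout=0.5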

teddykspark commented 10 years ago

Thanks Daniel.

WytheZ commented 9 years ago

@teddykspark, can you give me some details of how you dealt with CUDA 4.0? My CUDA version is 4.2. Thanks!!