SeungjunNah / DeepDeblur_release

Deep Multi-scale CNN for Dynamic Scene Deblurring

Where can I get the REDS dataset? #40

Closed: toggle1995 closed this issue 4 years ago

toggle1995 commented 5 years ago

Thanks for your work; your dataset is very useful for image deblurring. I am very interested in your work and would like to follow it, so I want to know where I can get your new REDS dataset.

SeungjunNah commented 5 years ago

Hi @toggle1995 ,

Thank you for your interest in our new dataset. It was provided for the video challenges at the NTIRE 2019 workshop. We will make the dataset available to the public in the near future; for now, we haven't finished the challenge-related work yet. I will let you know when we are ready to release the REDS dataset.

toggle1995 commented 5 years ago

Thank you! @SeungjunNah I look forward to getting your new dataset. I hope to be informed when you release it.

kaikang90 commented 5 years ago

@SeungjunNah How is the new dataset release progressing?

SeungjunNah commented 5 years ago

Hi @kaikang90, We are discussing when and how the dataset will be managed. I will try to notify you as soon as I can.

kaikang90 commented 5 years ago

Hi @SeungjunNah , Thanks for the feedback; looking forward to your update.

SeungjunNah commented 5 years ago

@toggle1995 @kaikang90 Thank you for waiting. The REDS dataset is now published online! You can find the download links at https://seungjunnah.github.io

kaikang90 commented 5 years ago

Hi @SeungjunNah , got it, thank you!

SeungjunNah commented 5 years ago

@toggle1995 @kaikang90 I have also made the test input data accessible at https://seungjunnah.github.io/Datasets/reds. We are planning to set up a submission site that provides evaluation results with public leaderboards. I will post another notice when the site is online.

zaansari-irde commented 5 years ago

I want to use your REDS dataset (https://seungjunnah.github.io/Datasets/reds) for image deblurring.

But I am unable to download a few of the files (the bigger ones). The displayed message is: 'Access denied. Invalid user name or password.'

Kindly help.

SeungjunNah commented 5 years ago

@zaansari-irde

Which file exactly did you try to download? I'm not sure what's causing the problem, because you don't need to log in anywhere. Each file is on Google Drive with a public share link and can be downloaded by clicking.

You can also try the download_REDS.py script, which is hosted there as well. For example, you can run the following command to download train_blur.zip and train_sharp.zip.

python download_REDS.py --train_blur --train_sharp
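
For reference, the script essentially automates fetching large files from public Google Drive shares. Below is a minimal, hypothetical sketch of that general pattern (assuming the requests package; the file ID and output name are placeholders), not the actual contents of download_REDS.py:

import requests

def download_from_google_drive(file_id, destination):
    # Public Google Drive files can be fetched through the uc?export=download endpoint.
    url = "https://docs.google.com/uc?export=download"
    session = requests.Session()
    response = session.get(url, params={"id": file_id}, stream=True)

    # Large files trigger a "can't scan for viruses" page; pick up the confirm token.
    token = None
    for key, value in response.cookies.items():
        if key.startswith("download_warning"):
            token = value
    if token:
        response = session.get(url, params={"id": file_id, "confirm": token}, stream=True)

    # Stream to disk in 1 MiB chunks to avoid holding the whole archive in memory.
    with open(destination, "wb") as f:
        for chunk in response.iter_content(chunk_size=1 << 20):
            if chunk:
                f.write(chunk)

# Example (placeholder ID): download_from_google_drive("FILE_ID", "train_sharp.zip")
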
DavideA commented 4 years ago

Hello,

I am also experiencing issues downloading train_sharp.zip. Via the browser links, both the Google Drive and SNU downloads fail quite early. Via the download script, 11 GB out of 31 get downloaded (apparently part of a multi-part archive?) and the file cannot be unzipped.

Is there some other way I can download train_sharp?

Thank you, D

SeungjunNah commented 4 years ago

Hi @DavideA, All the files are single zip files; they are not multi-part archives. This could be related to traffic limitations: Google Drive has an unspecified traffic limit, and the SNU server's bandwidth is currently almost fully occupied.

You can try downloading later or send me an email at seungjun.nah@gmail.com, and I can give you a temporary mirror link.
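
For anyone hitting the same symptom, a quick way to tell whether a download was truncated before retrying is to check the file size and test the zip's central directory. This is only a minimal sketch using the Python standard library (the local filename is a placeholder, and testing a ~30 GB archive takes a while):

import os
import zipfile

path = "train_sharp.zip"  # placeholder: path to the local download

# A file that stopped at ~11 GB out of ~31 GB will show up as undersized here.
print("size on disk: %.1f GiB" % (os.path.getsize(path) / 2**30))

try:
    with zipfile.ZipFile(path) as zf:
        bad = zf.testzip()  # returns the first corrupted member name, or None
        print("archive OK" if bad is None else "corrupted member: %s" % bad)
except zipfile.BadZipFile:
    print("truncated or corrupted download; retry later or ask for a mirror link")
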

DavideA commented 4 years ago

Hi Seungjun,

thank you for your prompt reply. I have been trying to download the dataset for a couple of days now. If you could provide me with a mirror link (even just for train_sharp), your help would be much appreciated.

Thank you, Best, D

SeungjunNah commented 4 years ago

Hi @DavideA Please send me an email rather than posting on this public GitHub issue thread so that I can reply to you personally. Otherwise, the traffic through that link could get exhausted again.