euancrabtree / Lipreading-PyTorch

Lip Reading in the Wild using ResNet and LSTMs in PyTorch

about dataset #5

Open CXiaoDing opened 5 years ago

CXiaoDing commented 5 years ago

I ran main.py but got the error below. Can someone here help me solve this problem? Thank you very much!

```
Traceback (most recent call last):
  File "/home/cxd/PycharmProjects/LRW/main.py", line 20, in <module>
    model.load_state_dict(torch.load(options["general"]["pretrainedmodelpath"]))
  File "/home/cxd/anaconda3/lib/python3.6/site-packages/torch/serialization.py", line 356, in load
    f = open(f, 'rb')
FileNotFoundError: [Errno 2] No such file or directory: 'trainedmodel.pt'
```

sauravsolanki commented 5 years ago

Yes please, I am having the same issue. @CXiaoDing Did you figure out how to solve it?

chendeguang commented 5 years ago

I am also running into this problem now. Have you solved it?

CXiaoDing commented 5 years ago

Which problem? Sorry, I have not looked at this model in a long time, so I do not quite remember.


CXiaoDing commented 5 years ago

In the options.toml file there is a loadpretrainedmodel option; just change it to false.

CXiaoDing commented 5 years ago

In the options.toml file there is a parameter, loadpretrainedmodel; just change it to false.
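For context, a minimal sketch of why that flag works, assuming main.py guards the checkpoint load roughly like this (the toml package and the placeholder model are assumptions, not the repo's exact code; the option keys come from the traceback and the comments above):

```python
import toml  # assumption: any TOML reader works for options.toml
import torch
import torch.nn as nn

options = toml.load("options.toml")
model = nn.Linear(10, 2)  # hypothetical placeholder for the repo's ResNet+LSTM model

# With loadpretrainedmodel = false, torch.load() is never called, so the
# missing 'trainedmodel.pt' checkpoint is never opened and the
# FileNotFoundError disappears. Training then starts from scratch.
if options["general"]["loadpretrainedmodel"]:
    model.load_state_dict(torch.load(options["general"]["pretrainedmodelpath"]))
```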

fheilz commented 5 years ago

@CXiaoDing Your solution works. @psyec1 Please close this issue.

lymgoforit commented 5 years ago

```
Traceback (most recent call last):
  File "main.py", line 26, in <module>
    trainer = Trainer(options)
  File "/home/er607/桌面/lip/Lipreading-PyTorch-master/training.py", line 26, in __init__
    self.trainingdataset = LipreadingDataset(options["training"]["dataset"], "train")
  File "/home/er607/桌面/lip/Lipreading-PyTorch-master/data/dataset.py", line 31, in __init__
    self.label_list, self.file_list = self.build_file_list(directory, set)
  File "/home/er607/桌面/lip/Lipreading-PyTorch-master/data/dataset.py", line 9, in build_file_list
    labels = os.listdir(dir)
FileNotFoundError: [Errno 2] No such file or directory: '/udisk/pszts-ssd/AV-ASR-data/BBC_Oxford/lipread_mp4'
```

Do I need to download lipread_mp4 myself?
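For context: the path in the traceback is the original author's, so options["training"]["dataset"] in options.toml has to point at a local copy of the dataset. A hedged reconstruction of build_file_list, assuming the standard LRW layout of one directory per word label, each with train/val/test subdirectories of .mp4 clips:

```python
import os

def build_file_list(directory, set_name):
    # The line below is what raises FileNotFoundError when the configured
    # dataset path does not exist on this machine.
    labels = os.listdir(directory)
    file_list = []
    for label in labels:
        # Assumed LRW layout: lipread_mp4/<WORD>/<train|val|test>/<WORD>_xxxxx.mp4
        subdir = os.path.join(directory, label, set_name)
        for filename in os.listdir(subdir):
            if filename.endswith(".mp4"):
                file_list.append((label, os.path.join(subdir, filename)))
    return labels, file_list
```

If that matches, pointing options["training"]["dataset"] at the root of the downloaded lipread_mp4 directory resolves the error.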

chendeguang commented 5 years ago

Yes, you have to request it from the Chinese Academy of Sciences. URL: http://vipl.ict.ac.cn/team.php?id=10


lymgoforit commented 5 years ago

@chendeguang Found it and downloaded it. Thanks a lot.

PonderK commented 5 years ago

@lymgoforit Hello, could you share the dataset for this code with me? The link above no longer opens. Thank you for the trouble.

YYingcute commented 5 years ago

> @chendeguang Found it and downloaded it. Thanks a lot.

How did you find and download it? Could you tell me, if it's convenient?

YYingcute commented 5 years ago

@lymgoforit Could you please share the dataset? My email is ivory.yang.ying@gmail.com. Much appreciated.

cyz-951024 commented 2 years ago

Hello, could you share the lipread_MP4 files? Email: henucyz@163.com. Many thanks.