AlibabaResearch / DAMO-ConvAI

DAMO-ConvAI: The official repository containing the codebase for Alibaba DAMO Conversational AI.
MIT License

sunsql problem with starting training #3

Closed OverRipeThree49 closed 2 years ago

OverRipeThree49 commented 2 years ago

I followed the instructions in your README and preprocessing completed without problems. But when I tried to start training, the program exited with the following error:

Traceback (most recent call last):
  File "/home/SunSQL/scripts/text2sql.py", line 113, in <module>
    assert now[0].query == now[1].query
IndexError: list index out of range

and this is the code snippet where the error occurs:

for wl in range(0, len(cur_dataset), 2):
    # print(wl.query)
    now = cur_dataset[wl : wl+2]
    # print(now[0].query)
    # print(now[1].query)
    assert now[0].query == now[1].query

So I inspected the cur_dataset variable and found it held only 3 entries when the error happened, which explains the problem: when wl=2, the slice now contains only 1 entry, so now[1] raises the IndexError. I tried to work around this by breaking out of the loop when wl reaches len(cur_dataset)-1, but then a different error occurs when loss.backward() is executed, complaining that a tensor dimension is incorrect.
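For readers hitting the same traceback: the loop assumes the preprocessed dataset always contains entries in consecutive pairs sharing the same query, so an odd-length dataset is itself the symptom of a preprocessing problem rather than something to skip over. A minimal sketch of the pairing logic, with an up-front even-length check that fails fast instead of crashing mid-slice (the `Example` class and `iter_pairs` helper here are illustrative, not the project's actual code):

```python
from dataclasses import dataclass

@dataclass
class Example:
    query: str  # illustrative stand-in for the project's example objects

def iter_pairs(cur_dataset):
    """Yield consecutive (even-index, odd-index) pairs of examples.

    An odd-length dataset would leave a trailing unpaired entry, so we
    check parity up front rather than let the slice fail mid-loop.
    """
    assert len(cur_dataset) % 2 == 0, "dataset must contain paired entries"
    for wl in range(0, len(cur_dataset), 2):
        a, b = cur_dataset[wl], cur_dataset[wl + 1]
        assert a.query == b.query  # paired entries share one query
        yield a, b

pairs = list(iter_pairs([
    Example("q1"), Example("q1"),
    Example("q2"), Example("q2"),
]))
```

Silently dropping the trailing entry (as the reporter tried) only masks the issue and surfaces later as a shape mismatch in loss.backward(), which is why re-running preprocessing was the actual fix.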

huybery commented 2 years ago

@OverRipeThree49 Hi, sorry for this error. We have fixed the script; please try again.

huybery commented 2 years ago

@OverRipeThree49 Is everything ready?

OverRipeThree49 commented 2 years ago

@huybery I preprocessed all the data again with the updated run/preprocessing.sh. Still no luck; the same error remains. Did I miss anything besides the run/preprocessing.sh file?

huybery commented 2 years ago

Can you re-pull the repository and rebuild it? Training runs without problems on my side.

OverRipeThree49 commented 2 years ago

@huybery I re-downloaded all the files and everything works now. Thanks!

huybery commented 2 years ago

@OverRipeThree49 Glad to help. Thank you again for your interest in our work!