Closed: hattonjs closed this issue 1 year ago.
Ran into the same problem, so I am bumping up this issue.
I tried to use the Docker image to experiment with lead sheet creation from sound files, but without the datasets it does nothing. @chrisdonahue it seems the download files got relocated at Stanford; however, the redirects point to a nonexistent location on the target server. https://nlp.stanford.edu/data/* is redirected to https://downloads.cs.stanford.edu/nlp/data/..., but there is no subdirectory with your files.
Any chance of getting this fixed? That would be great. Regards, Ulrich
I have a copy of this dataset. You can leave your email here, and I will send you the files.
Hello. Can I get the dataset too? Thank you and best regards.
My mailing address is @.*** Thank you a lot. Have a nice day.
The email address is full of star symbols.
firaterdogan24@gmail.com. I hope it works this time. Thanks again.
Sent by email.
My email address is jeff.hatton@hotmail.com. Please send me the dataset. Thank you!
@shansongliu - thank you for offering the files! This is the list of URLs requested by the prepare.sh setup script:
https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/0919_02_e0908_oafmelspecnorm/5b739ce5efa2b6d4d70c5f1feac802684f0ee6f4.cfg.json
https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/0919_02_e0908_oafmelspecnorm/step.pkl
https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/0919_00_e0830_oafmelspecnorm/model.pt
https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/oafmelspec_moments.npy
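To check which of these assets are currently reachable, a quick probe with curl can help (a sketch; `check_urls` is a hypothetical helper, and the URLs are the ones listed above):

```shell
#!/bin/sh
# Hypothetical helper: print the HTTP status code for each asset URL.
# 200 means reachable; 404 or 000 means missing or unreachable.
check_urls() {
  for url in "$@"; do
    printf '%s %s\n' \
      "$(curl -s -o /dev/null -w '%{http_code}' -L --max-time 15 "$url")" \
      "$url"
  done
}

check_urls \
  "https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/0919_02_e0908_oafmelspecnorm/5b739ce5efa2b6d4d70c5f1feac802684f0ee6f4.cfg.json" \
  "https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/0919_02_e0908_oafmelspecnorm/step.pkl" \
  "https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/0919_00_e0830_oafmelspecnorm/model.pt" \
  "https://nlp.stanford.edu/data/cdonahue/sheetsage/sheetsage/v0.2/oafmelspec_moments.npy"
```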
If more files are needed, they are welcome :-) My mail address is github - at - ausschlafen -dot- com, but you may also upload to this share link, which might be easier for you: https://my.hidrive.com/share/ln1pcm6kpt
I would leave the files accessible there for a couple of days for anybody else reading this thread, of course only if the file licenses permit it. Otherwise I would remove them after downloading.
Thanks and best regards, Ulrich
I have shared the google drive link as below ^_^: https://drive.google.com/file/d/1fZrF_Vkk90kHrmm9msG32WrZLnnIPhoB/view?usp=sharing
I have also created a Google group named "sheetsage_user". Welcome to join! You can leave your email below if you are interested in joining this group. We can have further discussion there.
For the original music audio files, I'm afraid I cannot provide them due to some restrictions. The author's files I shared in this link include the URLs.
Yes, add me to the group: jeff.hatton@hotmail.com
Sorry about this everyone! The server where I was hosting these files went offline for an unknown reason. Working to restore access.
Wow, that was quite a long list of asset URLs to transfer and rewrite in the code. Thank you @chrisdonahue!! Also to @shansongliu for trying to help. Regarding the group I guess I would not join, since I am just a mere mortal playing around with stuff :)
Sure, it's up to you, haha. I'm trying to train a transcription system with the help of Chris's code, so I want to see if anyone else is interested in this project; we can discuss it together.
The links still don't work for download.
It still doesn't work.
@hattonjs Basically it works: Chris has changed the code containing the URLs, and the links now point to a different server, but you have to fiddle a bit.
The Docker image containing the code is static, so the new code does not take effect until a new image is built. Until then, you can do the following:
In addition to downloading prepare.sh, grab the complete code from GitHub. In my case, I placed it into a subdirectory "from-github":
mkdir from-github && cd from-github && git clone https://github.com/chrisdonahue/sheetsage.git
Then modify your local copy of prepare.sh:
after SHEETSAGE_CACHE_DIR, add:
SHEETSAGE_GITHUB_DIR=$(pwd)/from-github
in the docker run command, add:
-v $SHEETSAGE_GITHUB_DIR/sheetsage/assets:/sheetsage/sheetsage/assets \
What this accomplishes: you override the "assets" directory inside the static Docker image with the assets directory from the freshly cloned code, so the new URLs are used when running prepare.sh.
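Put together, the steps above can be sketched roughly as follows (a sketch only; the exact structure of prepare.sh and the in-container paths are assumptions based on the description above):

```shell
#!/bin/sh
# Sketch of the workaround described above.
# 1. Clone the current sheetsage code next to prepare.sh:
mkdir -p from-github
[ -d from-github/sheetsage ] || \
  git clone https://github.com/chrisdonahue/sheetsage.git from-github/sheetsage

# 2. In your local prepare.sh, after the SHEETSAGE_CACHE_DIR line, add:
SHEETSAGE_GITHUB_DIR="$(pwd)/from-github"

# 3. In prepare.sh's "docker run" command, add an extra volume mount so the
#    freshly cloned assets directory (with the new URLs) shadows the one
#    baked into the static image:
#      -v "$SHEETSAGE_GITHUB_DIR/sheetsage/assets":/sheetsage/sheetsage/assets \
```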
There might be different or better ways to do it, but with these simple steps I was able to run prepare.sh, download everything needed for the non-Jukebox operation, and create lead sheets from some YouTube URLs. Bingo :)
Sooner or later the image may be rebuilt with the new code, hopefully with a newer version of yt-dlp on board for better handling of some YouTube source files.
Regards, Ulrich
I am trying to retrieve the datasets and am getting a 404. Thoughts?