vocalpy / vak

A neural network framework for researchers studying acoustic communication
https://vak.readthedocs.io
BSD 3-Clause "New" or "Revised" License

ENH: Add AVA models and datasets #674

Open · NickleDave opened this issue 1 year ago

NickleDave commented 1 year ago

https://autoencoded-vocal-analysis.readthedocs.io/en/latest/index.html
https://elifesciences.org/articles/67855
https://github.com/pearsonlab/autoencoded-vocal-analysis/tree/master

NickleDave commented 1 year ago

Discussed this with @marisbasha and @yardencsGitHub today. Updating here with some thoughts I've had.

NickleDave commented 1 year ago

I don't think we need this for the initial implementation but noting for future work:

NickleDave commented 1 year ago

Tentative / rough to-do list for @marisbasha after our meeting today

marisbasha commented 1 year ago

@NickleDave I am having trouble with nox -s test-data-generate. I receive the following error:

NotADirectoryError: Path specified for ``data_dir`` not found: tests/data_for_tests/source/audio_cbin_annot_notmat/gy6or6/032312

After inspecting, I see that tests/data_for_tests/source/ is an empty directory. I found a script in the code that downloads the gy6or6 data, so I put that data inside the audio_cbin_annot_notmat folder, but then I get an error saying there is no .not.mat file in the directory, and I cannot find a link to download that data anywhere else.

Just to clarify: should I use my own "toy data", or does running vak prep tests/data_for_tests/configs/ConvEncoderUMAP_train_audio_cbin_annot_notmat.toml generate the "toy data"? If that's the case, where should I download the data from?

NickleDave commented 1 year ago

Hey @marisbasha! Sorry you're running into this issue. It's probably something we haven't explained clearly enough.

> If that's the case, where should I download the data from?

Just checking, did you already download the "source" test data as described here? https://vak.readthedocs.io/en/latest/development/contributors.html#download-test-data

To do that, you would run:

nox -s test-data-download-source

> Just to clarify: should I use my own "toy data", or does running vak prep tests/data_for_tests/configs/ConvEncoderUMAP_train_audio_cbin_annot_notmat.toml generate the "toy data"?

You are right that these are basically "toy" datasets that are as small as possible. I tried to define the two different types in that section of the development set-up page, but just in case it's not clear: the "source" data is the set of inputs to vak, like audio and annotation files. You create the other type, the "generated" test data, when you run nox -s test-data-generate. This "generated" test data consists of (small) prepared datasets and results, some of which are used by the unit tests.
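
To put the whole workflow in one place, here is the two-step sequence, using only the nox sessions already mentioned in this thread:

# step 1: download the "source" test data (audio and annotation files, the inputs to vak)
nox -s test-data-download-source
# step 2: generate the prepared datasets and results that some unit tests use
nox -s test-data-generate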

You don't actually need to generate this test data to be able to develop. I just suggested it as a fairly painless way to check that you were able to set up the environment correctly. The script that generates the test data should be able to run to completion without any errors.

I am almost finished with the feature branch that will fix the unit tests, so you can run them to test what you are developing. That branch will also speed up the script that generates the test data considerably and reduce the size of the generated test data. https://github.com/vocalpy/vak/pull/693
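
Once that branch is merged, running the unit tests should just be a matter of invoking pytest; a minimal sketch, assuming the standard pytest layout that the tests/data_for_tests paths above suggest:

# run the unit tests from the root of your clone (assumes a standard pytest layout)
pytest tests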

Does that help?

marisbasha commented 1 year ago

Everything fine now. Thanks!

NickleDave commented 1 year ago

🙌 awesome, glad to hear it!

Will ping you here as soon as I get that branch merged; it does fix a couple of minor bugs, so you'll probably want to git pull those in along with the fixed tests.
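
Roughly something like the following, assuming your clone has a remote named upstream that points at vocalpy/vak and that you work off main (adjust the names to your setup):

# "upstream" and "main" are assumptions about your setup, not required names
git pull upstream main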

marisbasha commented 1 year ago

@NickleDave I have pushed to my fork again, with the parts split up by file. I am having trouble configuring the trainer. Could we have a brief discussion?

NickleDave commented 12 months ago

Ah whoops, sorry I missed this @marisbasha.

What you have so far looks great. I am reading through your code now to make sure I understand where you're at.

We can definitely discuss what to do with the trainer when we meet tomorrow.