sign-language-processing / datasets

TFDS data loaders for sign language datasets.
https://sign-language-processing.github.io/#existing-datasets

Several datasets not loading, with PermissionError when running PyTest, caused by `with open` on a folder? #55

Open cleong110 opened 4 months ago

cleong110 commented 4 months ago

SignBank and Sign2Mint are failing to load in https://github.com/sign-language-processing/datasets/blob/master/examples/load.ipynb; perhaps this is why.

Running PyTest as noted in #53, I find that some datasets fail with PermissionErrors like the one below. https://stackoverflow.com/questions/36434764/permissionerror-errno-13-permission-denied suggests that this might be caused by calling `with open(...)` on a folder instead of a file.

Example:

```python
self = <datasets.sign_language_datasets.datasets.signbank.signbank.SignBank object at 0x000001A8C5C903D0>
dl_manager = <tensorflow_datasets.core.download.download_manager.DownloadManager object at 0x000001A8C5C93790>

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        """Returns SplitGenerators."""
        dataset_warning(self)

        index = dl_manager.download("http://signbank.org/signpuddle2.0/data/spml/")
        regex = r"\"sgn[\d]+.spml\""
>       with open(index, "r", encoding="utf-8") as f:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\signbank\\dummy_data'

sign_language_datasets\datasets\signbank\signbank.py:218: PermissionError
```
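The hypothesis is easy to confirm in isolation: calling `open()` on a directory raises exactly this error on Windows. A minimal sketch, independent of TFDS (on Windows this raises `PermissionError` [Errno 13]; POSIX systems raise `IsADirectoryError` instead):

```python
import os
import tempfile

# open() on a directory fails: PermissionError [Errno 13] on Windows,
# IsADirectoryError on Linux/macOS.
with tempfile.TemporaryDirectory() as folder:
    assert os.path.isdir(folder)
    try:
        with open(folder, "r", encoding="utf-8") as f:
            f.read()
    except (PermissionError, IsADirectoryError) as err:
        print(f"Opening a folder raised: {err!r}")
```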

Affected Datasets

Datasets that may be affected by this:

Sign2MINT

```python
self =
dl_manager =

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        dataset_warning(self)

        annotations_path = dl_manager.download("https://sign2mint.de/api/entries/all/")

        local_videos = {}
        if self._builder_config.include_video and self._builder_config.process_video:
>           with open(annotations_path, "r", encoding="utf-8") as f:
E           PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\sign2mint\\dummy_data'

sign_language_datasets\datasets\sign2mint\sign2mint.py:88: PermissionError
```

![image](https://github.com/sign-language-processing/datasets/assets/122366389/99f58b4c-0f22-49d2-98d9-0f2d4e2ccb62)

SignBank

```python
self =
dl_manager =

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        """Returns SplitGenerators."""
        dataset_warning(self)

        index = dl_manager.download("http://signbank.org/signpuddle2.0/data/spml/")
        regex = r"\"sgn[\d]+.spml\""
>       with open(index, "r", encoding="utf-8") as f:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\signbank\\dummy_data'

sign_language_datasets\datasets\signbank\signbank.py:218: PermissionError
```

![image](https://github.com/sign-language-processing/datasets/assets/122366389/3f65a9f9-d461-4e98-887a-d6604f1a1bdb)
![image](https://github.com/sign-language-processing/datasets/assets/122366389/bfd865eb-80c6-42eb-ae74-b069e0b56e57)

ASL Lex

```python
self =
csv_path = WindowsGPath('C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\asl_lex\\dummy_data')

    def _generate_examples(self, csv_path: str):
        """Yields examples."""
>       with open(csv_path, "r", encoding="ISO-8859-1") as csvfile:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\asl_lex\\dummy_data'

sign_language_datasets\datasets\asl_lex\asl_lex.py:72: PermissionError
```

DGS Corpus

```python
filepath_in = 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv', filepath_out = 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv.gz'

    def _gzip_file(filepath_in: str, filepath_out: str) -> None:
>       with open(filepath_in, "rb") as filehandle_in:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv'

sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:75: PermissionError
```

Dicta-Sign

```python
self =
dl_manager =

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        dataset_warning(self)

        concepts_path = dl_manager.download("https://www.sign-lang.uni-hamburg.de/dicta-sign/portal/concepts/concepts_eng.html")

        if self._builder_config.include_pose is not None:
            poses_path = dl_manager.download_and_extract(_POSE_URLS[self._builder_config.include_pose])
        else:
            poses_path = None

        regex = r"
>       with open(concepts_path, "r", encoding="utf-8") as concepts_f:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\dicta_sign\\dummy_data'

sign_language_datasets\datasets\dicta_sign\dicta_sign.py:92: PermissionError
```
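Note that the DGS Corpus case above is probably distinct: its `PermissionError` comes from `_gzip_file` reopening a `tempfile.NamedTemporaryFile` by name while the original handle is still open, which the `tempfile` docs say is not possible on Windows (it works on Unix). A minimal sketch of the failing pattern and a portable workaround; the `_gzip_file` signature matches the test helper, but the body and gzip logic here are illustrative guesses:

```python
import gzip
import os
import shutil
import tempfile

def _gzip_file(filepath_in: str, filepath_out: str) -> None:
    # Same signature as the helper in dgs_corpus_test.py; body is a guess.
    with open(filepath_in, "rb") as filehandle_in:
        with gzip.open(filepath_out, "wb") as filehandle_out:
            shutil.copyfileobj(filehandle_in, filehandle_out)

# On Windows the failing pattern is roughly:
#     with tempfile.NamedTemporaryFile() as filehandle:
#         _gzip_file(filehandle.name, filehandle.name + ".gz")  # PermissionError
# because the file is still open. Closing it first works everywhere:
tmp = tempfile.NamedTemporaryFile(delete=False)
try:
    tmp.write(b"mock openpose data")
    tmp.close()  # release the handle so a second open() succeeds on Windows
    _gzip_file(tmp.name, tmp.name + ".gz")
finally:
    os.remove(tmp.name)
    if os.path.exists(tmp.name + ".gz"):
        os.remove(tmp.name + ".gz")
```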

I tried these in the Colab loading script, and SignBank and Sign2Mint do in fact crash.

For SignBank the issue might be at https://github.com/sign-language-processing/datasets/blob/3aa515c0da9f3c5f43db5a8cc407a7abbe083db0/sign_language_datasets/datasets/signbank/signbank.py#L216

For Sign2Mint the issue might be at https://github.com/sign-language-processing/datasets/blob/3aa515c0da9f3c5f43db5a8cc407a7abbe083db0/sign_language_datasets/datasets/sign2mint/sign2mint.py#L84
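In both cases the result of `dl_manager.download(...)` is passed straight to `open()`. In the unit tests the mocked download manager resolves the URL to the dataset's `dummy_data` directory (as the tracebacks above show), so `open()` receives a folder. A hedged sketch of a defensive guard, not the repository's actual fix (`_resolve_index_path` is a hypothetical helper; a real builder would likely use `tf.io.gfile` equivalents instead of `os`, as the AutSL failure in the log below demands):

```python
import os

def _resolve_index_path(downloaded_path) -> str:
    """If the 'downloaded' path is a directory (e.g. TFDS dummy_data in tests),
    return the first regular file inside it instead of the folder itself."""
    downloaded_path = str(downloaded_path)
    if os.path.isdir(downloaded_path):
        for name in sorted(os.listdir(downloaded_path)):
            candidate = os.path.join(downloaded_path, name)
            if os.path.isfile(candidate):
                return candidate
        raise FileNotFoundError(f"No index file inside {downloaded_path}")
    return downloaded_path

# Illustrative use in signbank.py's _split_generators:
#     index = dl_manager.download("http://signbank.org/signpuddle2.0/data/spml/")
#     with open(_resolve_index_path(index), "r", encoding="utf-8") as f:
#         ...
```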

cleong110 commented 4 months ago

Below is the full PyTest output. It is very long!

Full PyTest output:

```
(sign_language_datasets_source) C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets>pytest
=================================================================== test session starts ===================================================================
platform win32 -- Python 3.10.13, pytest-8.0.1, pluggy-1.4.0 -- C:\Users\Colin\miniconda3\envs\sign_language_datasets_source\python.exe
cachedir: .pytest_cache
rootdir: C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets
configfile: pytest.ini
plugins: cov-4.1.0
collected 126 items

sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_baseclass PASSED [ 0%]
sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_download_and_prepare_as_dataset FAILED [ 1%]
sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_info PASSED [ 2%]
sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_registered PASSED [ 3%]
sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_session SKIPPED (Not a test.) [ 3%]
sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_tags_are_valid PASSED [ 4%]
sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_baseclass PASSED [ 5%]
sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_download_and_prepare_as_dataset PASSED [ 6%]
sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_info PASSED [ 7%]
sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_registered PASSED [ 7%]
sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_session SKIPPED (Not a test.) [ 8%]
sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_tags_are_valid PASSED [ 9%]
sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_baseclass PASSED [ 10%]
sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_download_and_prepare_as_dataset FAILED [ 11%]
sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_info PASSED [ 11%]
sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_registered PASSED [ 12%]
sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_session SKIPPED (Not a test.) [ 13%]
sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_tags_are_valid PASSED [ 14%]
sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_baseclass FAILED [ 15%]
sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_download_and_prepare_as_dataset FAILED [ 15%]
sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_info FAILED [ 16%]
sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_registered FAILED [ 17%]
sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_session SKIPPED (Not a test.) [ 18%]
sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_tags_are_valid FAILED [ 19%]
sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_baseclass PASSED [ 19%]
sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_download_and_prepare_as_dataset FAILED [ 20%]
sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_info PASSED [ 21%]
sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_registered PASSED [ 22%]
sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_session SKIPPED (Not a test.) [ 23%]
sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_tags_are_valid PASSED [ 23%]
sign_language_datasets/datasets/dgs_corpus/dgs_corpus_test.py::TestDgsCorpusAuxiliaryFunctions::test_convert_dgs_dict_to_openpose_frames PASSED [ 24%]
sign_language_datasets/datasets/dgs_corpus/dgs_corpus_test.py::TestDgsCorpusAuxiliaryFunctions::test_get_poses_return_type FAILED [ 25%]
sign_language_datasets/datasets/dgs_corpus/dgs_corpus_test.py::TestDgsCorpusAuxiliaryFunctions::test_get_poses_subset_of_camera_names FAILED [ 26%]
sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_baseclass PASSED [ 26%]
sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_download_and_prepare_as_dataset FAILED [ 27%]
sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_info PASSED [ 28%]
sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_registered PASSED [ 29%]
sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_session SKIPPED (Not a test.) [ 30%]
sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_tags_are_valid PASSED [ 30%]
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_baseclass PASSED [ 31%]
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_download_and_prepare_as_dataset FAILED [ 32%]
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_info PASSED [ 33%]
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_registered PASSED [ 34%]
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_session SKIPPED (Not a test.) [ 34%]
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_tags_are_valid PASSED [ 35%]
sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_baseclass PASSED [ 36%]
sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_download_and_prepare_as_dataset FAILED [ 37%]
sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_info PASSED [ 38%]
sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_registered PASSED [ 38%]
sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_session SKIPPED (Not a test.) [ 39%]
sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_tags_are_valid PASSED [ 40%]
sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_baseclass FAILED [ 41%]
sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_download_and_prepare_as_dataset FAILED [ 42%]
sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_info FAILED [ 42%]
sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_registered FAILED [ 43%]
sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_session SKIPPED (Not a test.) [ 44%]
sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_tags_are_valid FAILED [ 45%]
sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_baseclass FAILED [ 46%]
sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_download_and_prepare_as_dataset FAILED [ 46%]
sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_info FAILED [ 47%]
sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_registered FAILED [ 48%]
sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_session SKIPPED (Not a test.) [ 49%]
sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_tags_are_valid FAILED [ 50%]
sign_language_datasets/datasets/ngt_corpus/ngt_corpus_test.py::TestNgtCorpus::test_ngt_corpus_loader PASSED [ 50%]
sign_language_datasets/datasets/ngt_corpus/ngt_corpus_test.py::TestNgtCorpusUtils::test_ngt_get_elan_sentences PASSED [ 51%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TVideosTest::test_baseclass PASSED [ 52%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TVideosTest::test_download_and_prepare_as_dataset PASSED [ 53%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TVideosTest::test_info PASSED [ 53%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TVideosTest::test_registered PASSED [ 54%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TVideosTest::test_session SKIPPED (Not a test.) [ 55%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TVideosTest::test_tags_are_valid PASSED [ 56%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_baseclass PASSED [ 57%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_download_and_prepare_as_dataset FAILED [ 57%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_info PASSED [ 58%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_registered PASSED [ 59%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_session SKIPPED (Not a test.) [ 60%]
sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_tags_are_valid PASSED [ 61%]
sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_baseclass PASSED [ 61%]
sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_download_and_prepare_as_dataset FAILED [ 62%]
sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_info PASSED [ 63%]
sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_registered PASSED [ 64%]
sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_session SKIPPED (Not a test.) [ 65%]
sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_tags_are_valid PASSED [ 65%]
sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_baseclass PASSED [ 66%]
sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_download_and_prepare_as_dataset FAILED [ 67%]
sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_info PASSED [ 68%]
sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_registered PASSED [ 69%]
sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_session SKIPPED (Not a test.) [ 69%]
sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_tags_are_valid PASSED [ 70%]
sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_baseclass PASSED [ 71%]
sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_download_and_prepare_as_dataset FAILED [ 72%]
sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_info PASSED [ 73%]
sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_registered PASSED [ 73%]
sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_session SKIPPED (Not a test.) [ 74%]
sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_tags_are_valid PASSED [ 75%]
sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_baseclass PASSED [ 76%]
sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_download_and_prepare_as_dataset FAILED [ 76%]
sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_info PASSED [ 77%]
sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_registered PASSED [ 78%]
sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_session SKIPPED (Not a test.) [ 79%]
sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_tags_are_valid PASSED [ 80%]
sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_baseclass PASSED [ 80%]
sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_download_and_prepare_as_dataset FAILED [ 81%]
sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_info PASSED [ 82%]
sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_registered PASSED [ 83%]
sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_session SKIPPED (Not a test.) [ 84%]
sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_tags_are_valid PASSED [ 84%]
sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_baseclass PASSED [ 85%]
sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_download_and_prepare_as_dataset FAILED [ 86%]
sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_info PASSED [ 87%]
sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_registered PASSED [ 88%]
sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_session SKIPPED (Not a test.) [ 88%]
sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_tags_are_valid PASSED [ 89%]
sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_baseclass PASSED [ 90%]
sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_download_and_prepare_as_dataset FAILED [ 91%]
sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_info PASSED [ 92%]
sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_registered PASSED [ 92%]
sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_session SKIPPED (Not a test.) [ 93%]
sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_tags_are_valid PASSED [ 94%]
sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_baseclass FAILED [ 95%]
sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_download_and_prepare_as_dataset FAILED [ 96%]
sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_info FAILED [ 96%]
sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_registered FAILED [ 97%]
sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_session SKIPPED (Not a test.) [ 98%]
sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_tags_are_valid FAILED [ 99%]
sign_language_datasets/utils/signwriting/ocr/ocr_test.py::TestOCR::test_should_extract_fsw_from_image FAILED [100%]

======================================================================== FAILURES =========================================================================
_____________________________________________________ AslLexTest.test_download_and_prepare_as_dataset _____________________________________________________

self = , args = (), kwargs = {}

    def decorated(self, *args, **kwargs):
      """Run the decorated test method."""
      if not tf.executing_eagerly():
        raise ValueError(
            'Must be executing eagerly when using the '
            'run_in_graph_and_eager_modes decorator.'
        )
      with self.subTest('eager_mode'):
>       f(self, *args, **kwargs)

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset
    self._download_and_prepare_as_dataset(builder)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset
    builder.download_and_prepare(download_config=download_config)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare
    self._download_and_prepare(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1600: in _download_and_prepare
    future = split_builder.submit_split_generation(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\split_builder.py:332: in submit_split_generation
    return self._build_from_generator(**build_kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\split_builder.py:392: in _build_from_generator
    for key, example in utils.tqdm(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tqdm\std.py:1181: in __iter__
    for obj in iterable:
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:495: in _iter_examples
    for key, ex in generator:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
csv_path = WindowsGPath('C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\asl_lex\\dummy_data')

    def _generate_examples(self, csv_path: str):
        """Yields examples."""
>       with open(csv_path, "r", encoding="ISO-8859-1") as csvfile:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\asl_lex\\dummy_data'

sign_language_datasets\datasets\asl_lex\asl_lex.py:72: PermissionError
------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
Total configs: 2
Testing config default
Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpsv87zewv\asl_lex\default\2.0.0...
------------------------------------------------------------------ Captured stderr call -------------------------------------------------------------------
_____________________________________________________ AutslTest.test_download_and_prepare_as_dataset ______________________________________________________

self = , args = (), kwargs = {}

    def decorated(self, *args, **kwargs):
      """Run the decorated test method."""
      if not tf.executing_eagerly():
        raise ValueError(
            'Must be executing eagerly when using the '
            'run_in_graph_and_eager_modes decorator.'
        )
      with self.subTest('eager_mode'):
>       f(self, *args, **kwargs)

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset
    self._download_and_prepare_as_dataset(builder)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset
    builder.download_and_prepare(download_config=download_config)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare
    self._download_and_prepare(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1563: in _download_and_prepare
    split_generators = self._split_generators(  # pylint: disable=unexpected-keyword-arg
sign_language_datasets\datasets\autsl\autsl.py:184: in _split_generators
    train_parts = self._download_and_extract_multipart(dl_manager, url=_TRAIN_VIDEOS, parts=18,
sign_language_datasets\datasets\autsl\autsl.py:139: in _download_and_extract_multipart
    if not os.path.isfile(output_path):
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\unittest\mock.py:1114: in __call__
    return self._mock_call(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\unittest\mock.py:1118: in _mock_call
    return self._execute_mock_call(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
args = ('C:\\Users\\Colin\\AppData\\Local\\Temp\\run0vq9m9l6\\tmpzhzofbgu\\downloads\\158.109.8.102_AuTSL_train_train_set_vfbha3DHr77hYLRl-ycIQ771v0XzxgCNOh2iXj_vS9Nsv4kxo.zip',)
kwargs = {}, effect = AssertionError('Do not use `os`, but `tf.io.gfile` module instead. This makes code compatible with more filesystems.')

    def _execute_mock_call(self, /, *args, **kwargs):
        # separate from _increment_mock_call so that awaited functions are
        # executed separately from their call, also AsyncMock overrides this method

        effect = self.side_effect
        if effect is not None:
            if _is_exception(effect):
>               raise effect
E               AssertionError: Do not use `os`, but `tf.io.gfile` module instead. This makes code compatible with more filesystems.

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\unittest\mock.py:1173: AssertionError
------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
Total configs: 3
Testing config default
Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpzhzofbgu\autsl\default\1.0.0...
output_path C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpzhzofbgu\downloads\158.109.8.102_AuTSL_train_train_set_vfbha3DHr77hYLRl-ycIQ771v0XzxgCNOh2iXj_vS9Nsv4kxo.zip
output_path_extracted C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpzhzofbgu\downloads\extracted\158.109.8.102_AuTSL_train_train_set_vfbha3DHr77hYLRl-ycIQ771v0XzxgCNOh2iXj_vS9Nsv4kxo.zip
______________________________________________________________ BslCorpusTest.test_baseclass _______________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , config = None

    def _make_builder(self, config=None):
>       return self.dataset_class(  # pylint: disable=not-callable
            data_dir=self.tmp_dir, config=config, version=self.VERSION
        )
E       TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password'

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError
___________________________________________________ BslCorpusTest.test_download_and_prepare_as_dataset ____________________________________________________

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: in setUp
    self.builder = self._make_builder()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , config = None

    def _make_builder(self, config=None):
>       return self.dataset_class(  # pylint: disable=not-callable
            data_dir=self.tmp_dir, config=config, version=self.VERSION
        )
E       TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password'

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError
_________________________________________________________________ BslCorpusTest.test_info _________________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , config = None

    def _make_builder(self, config=None):
>       return self.dataset_class(  # pylint: disable=not-callable
            data_dir=self.tmp_dir, config=config, version=self.VERSION
        )
E       TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password'

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError
______________________________________________________________ BslCorpusTest.test_registered ______________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , config = None

    def _make_builder(self, config=None):
>       return self.dataset_class(  # pylint: disable=not-callable
            data_dir=self.tmp_dir, config=config, version=self.VERSION
        )
E       TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password'

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError
____________________________________________________________ BslCorpusTest.test_tags_are_valid ____________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , config = None

    def _make_builder(self, config=None):
>       return self.dataset_class(  # pylint: disable=not-callable
            data_dir=self.tmp_dir, config=config, version=self.VERSION
        )
E       TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password'

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError
_________________________________________________ ChicagoFSWildTest.test_download_and_prepare_as_dataset __________________________________________________

self = 
args = (), kwargs = {}

    def decorated(self, *args, **kwargs):
      """Run the decorated test method."""
      if not tf.executing_eagerly():
        raise ValueError(
            'Must be executing eagerly when using the '
            'run_in_graph_and_eager_modes decorator.'
        )
      with self.subTest('eager_mode'):
>       f(self, *args, **kwargs)

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset
    self._download_and_prepare_as_dataset(builder)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset
    builder.download_and_prepare(download_config=download_config)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare
    self._download_and_prepare(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1600: in _download_and_prepare
    future = split_builder.submit_split_generation(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\split_builder.py:332: in submit_split_generation
    return self._build_from_generator(**build_kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\split_builder.py:392: in _build_from_generator
    for key, example in utils.tqdm(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tqdm\std.py:1181: in __iter__
    for obj in iterable:
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:495: in _iter_examples
    for key, ex in generator:
sign_language_datasets\datasets\chicagofswild\chicagofswild.py:128: in _generate_examples
    tar = tarfile.open(path.join(archive_directory, v_name + "-Frames.tgz"))
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\tarfile.py:1804: in open
    return func(name, "r", fileobj, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\tarfile.py:1870: in gzopen
    fileobj = GzipFile(name, mode + "b", compresslevel, fileobj)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <[AttributeError("'GzipFile' object has no attribute 'fileobj'") raised in repr()] GzipFile object at 0x1a8c413d1e0>
filename = 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\chicagofswild\\dummy_data\\ChicagoFSWildPlus-Frames.tgz'
mode = 'rb', compresslevel = 9, fileobj = None, mtime = None

    def __init__(self, filename=None, mode=None,
                 compresslevel=_COMPRESS_LEVEL_BEST, fileobj=None, mtime=None):
        """Constructor for the GzipFile class.

        At least one of fileobj and filename must be given a
        non-trivial value.

        The new class instance is based on fileobj, which can be a regular
        file, an io.BytesIO object, or any other object which simulates a file.
        It defaults to None, in which case filename is opened to provide
        a file object.

        When fileobj is not None, the filename argument is only used to be
        included in the gzip file header, which may include the original
        filename of the uncompressed file. It defaults to the filename of
        fileobj, if discernible; otherwise, it defaults to the empty string,
        and in this case the original filename is not included in the header.

        The mode argument can be any of 'r', 'rb', 'a', 'ab', 'w', 'wb', 'x', or
        'xb' depending on whether the file will be read or written. The default
        is the mode of fileobj if discernible; otherwise, the default is 'rb'.
        A mode of 'r' is equivalent to one of 'rb', and similarly for 'w' and
        'wb', 'a' and 'ab', and 'x' and 'xb'.

        The compresslevel argument is an integer from 0 to 9 controlling the
        level of compression; 1 is fastest and produces the least compression,
        and 9 is slowest and produces the most compression. 0 is no compression
        at all. The default is 9.

        The mtime argument is an optional numeric timestamp to be written to
        the last modification time field in the stream when compressing.
        If omitted or None, the current time is used.
        """
        if mode and ('t' in mode or 'U' in mode):
            raise ValueError("Invalid mode: {!r}".format(mode))
        if mode and 'b' not in mode:
            mode += 'b'
        if fileobj is None:
>           fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')
E           FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\chicagofswild\\dummy_data\\ChicagoFSWildPlus-Frames.tgz'

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\gzip.py:174: FileNotFoundError
------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
Total configs: 1
Testing config default
Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpotfd0dyc\chicago_fs_wild\default\2.0.0...
Extracting Frames Archive
------------------------------------------------------------------ Captured stderr call -------------------------------------------------------------------
_______________________________________________ TestDgsCorpusAuxiliaryFunctions.test_get_poses_return_type ________________________________________________

self = 

    def test_get_poses_return_type(self):
        camera_names_in_mock_data = ["a", "b", "c"]
        num_frames_in_mock_data = 10
        num_people_in_mock_data = 1

        people_to_extract = {"a", "b"}

>       with _create_tmp_dgs_openpose_file(
            camera_names=camera_names_in_mock_data, num_frames=num_frames_in_mock_data, num_people=num_people_in_mock_data
        ) as filepath:

sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:110:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\contextlib.py:135: in __enter__
    return next(self.gen)
sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:86: in _create_tmp_dgs_openpose_file
    _gzip_file(filehandle.name, filepath_zipped)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filepath_in = 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv', filepath_out = 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv.gz'

    def _gzip_file(filepath_in: str, filepath_out: str) -> None:
>       with open(filepath_in, "rb") as filehandle_in:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv'

sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:75: PermissionError
__________________________________________ TestDgsCorpusAuxiliaryFunctions.test_get_poses_subset_of_camera_names __________________________________________

self = 

    def test_get_poses_subset_of_camera_names(self):
        camera_names_in_mock_data = ["a2", "b1", "c5"]
        num_frames_in_mock_data = 10
        num_people_in_mock_data = 1

        people_to_extract = {"a", "b"}

>       with _create_tmp_dgs_openpose_file(
            camera_names=camera_names_in_mock_data, num_frames=num_frames_in_mock_data, num_people=num_people_in_mock_data
        ) as filepath:

sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\contextlib.py:135: in __enter__
    return next(self.gen)
sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:86: in _create_tmp_dgs_openpose_file
    _gzip_file(filehandle.name, filepath_zipped)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

filepath_in = 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmpqgafthwv', filepath_out = 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmpqgafthwv.gz'

    def _gzip_file(filepath_in: str, filepath_out: str) -> None:
>       with open(filepath_in, "rb") as filehandle_in:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmpqgafthwv'

sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py:75: PermissionError
____________________________________________________ DGSTypesTest.test_download_and_prepare_as_dataset ____________________________________________________

self = , args = ()
kwargs = {}

    def decorated(self, *args, **kwargs):
      """Run the decorated test method."""
      if not tf.executing_eagerly():
        raise ValueError(
            'Must be executing eagerly when using the '
            'run_in_graph_and_eager_modes decorator.'
        )
      with self.subTest('eager_mode'):
>       f(self, *args, **kwargs)

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset
    self._download_and_prepare_as_dataset(builder)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset
    builder.download_and_prepare(download_config=download_config)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare
    self._download_and_prepare(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1563: in _download_and_prepare
    split_generators = self._split_generators(  # pylint: disable=unexpected-keyword-arg
sign_language_datasets\datasets\dgs_types\dgs_types.py:181: in _split_generators
    galex_data = self.get_galex_data(dl_manager)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
dl_manager = 

    def get_galex_data(self, dl_manager: tfds.download.DownloadManager):
        GALEX = "https://www.sign-lang.uni-hamburg.de/galex/"
        index_urls = [f"{GALEX}tystatus/x{i}.html" for i in [2, 3, 4]]

        gloss_urls = set()
>       for p in dl_manager.download(index_urls):
E       TypeError: 'WindowsGPath' object is not iterable

sign_language_datasets\datasets\dgs_types\dgs_types.py:94: TypeError
------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
Total configs: 3
Testing config default
Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmp367l_2an\dgs_types\default\3.0.0...
___________________________________________________ DictaSignTest.test_download_and_prepare_as_dataset ____________________________________________________

self = , args = ()
kwargs = {}

    def decorated(self, *args, **kwargs):
      """Run the decorated test method."""
      if not tf.executing_eagerly():
        raise ValueError(
            'Must be executing eagerly when using the '
            'run_in_graph_and_eager_modes decorator.'
        )
      with self.subTest('eager_mode'):
>       f(self, *args, **kwargs)

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset
    self._download_and_prepare_as_dataset(builder)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset
    builder.download_and_prepare(download_config=download_config)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare
    self._download_and_prepare(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1563: in _download_and_prepare
    split_generators = self._split_generators(  # pylint: disable=unexpected-keyword-arg
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
dl_manager = 

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        dataset_warning(self)

        concepts_path = dl_manager.download("https://www.sign-lang.uni-hamburg.de/dicta-sign/portal/concepts/concepts_eng.html")

        if self._builder_config.include_pose is not None:
            poses_path = dl_manager.download_and_extract(_POSE_URLS[self._builder_config.include_pose])
        else:
            poses_path = None

        regex = r"
>       with open(concepts_path, "r", encoding="utf-8") as concepts_f:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\dicta_sign\\dummy_data'

sign_language_datasets\datasets\dicta_sign\dicta_sign.py:92: PermissionError
------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
Total configs: 3
Testing config default
Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpuoowkpvg\dicta_sign\default\1.0.0...
____________________________________________________ How2signTest.test_download_and_prepare_as_dataset ____________________________________________________

self = , args = ()
kwargs = {}

    def decorated(self, *args, **kwargs):
      """Run the decorated test method."""
      if not tf.executing_eagerly():
        raise ValueError(
            'Must be executing eagerly when using the '
            'run_in_graph_and_eager_modes decorator.'
        )
      with self.subTest('eager_mode'):
>       f(self, *args, **kwargs)

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset
    self._download_and_prepare_as_dataset(builder)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset
    builder.download_and_prepare(download_config=download_config)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare
    self._download_and_prepare(
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1563: in _download_and_prepare
    split_generators = self._split_generators(  # pylint: disable=unexpected-keyword-arg
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
dl_manager = 

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        """Returns SplitGenerators."""
        dataset_warning(self)

        # Define what files are required to download
        download_keys = []
        if self._builder_config.include_video is not None:
            download_keys += ["rgb_clips_front", "rgb_clips_side"]
        if self._builder_config.include_pose is not None:
            download_keys += ["bfh_2d_front", "bfh_2d_side"]

        urls = chain.from_iterable([[split[k] for k in download_keys] for split in _SPLITS.values()])
        urls = [url for url in urls if url is not None]
        downloads = dl_manager.download_and_extract(urls)
>       url_map = {u: d for u, d in zip(urls, downloads)}  # Map local paths
E       TypeError: 'WindowsGPath' object is not iterable

sign_language_datasets\datasets\how2sign\how2sign.py:108: TypeError
------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------
Total configs: 1
Testing config default
Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmplab2hqro\how2_sign\default\1.0.0...
_____________________________________________________________ MediapiSkelTest.test_baseclass ______________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: in _make_builder
    return self.dataset_class(  # pylint: disable=not-callable
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1335: in __init__
    super().__init__(**kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:283: in __init__
    self.info.initialize_from_bucket()
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:472: in info
    info = self._info()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def _info(self) -> tfds.core.DatasetInfo:
        """Returns the dataset metadata."""

        features = {
            "id": tfds.features.Text(),
            "metadata": {
                "fps": tf.int32,
                "height": tf.int32,
                "width": tf.int32,
                "duration": tf.float32,
                "frames": tf.int32,
            },
            "subtitles": tfds.features.Sequence({
                "start_time": tf.float32,
                "end_time": tf.float32,
                "text": tfds.features.Text(),
            }),
        }

        if self._builder_config.include_pose is not None:
            pose_header_path = _POSE_HEADERS[self._builder_config.include_pose]

            if self._builder_config.fps is not None:
                print("Pose FPS is not implemented for mediapi_skel dataset (since the original fps is not consistent)")

            if self._builder_config.include_pose == "openpose":
                pose_shape = (None, 1, 137, 2)
>               raise NotImplementedError("Openpose is available, but not yet implemented for mediapi_skel dataset.")
E               NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset.

sign_language_datasets\datasets\mediapi_skel\mediapi_skel.py:101: NotImplementedError
-------------------------------------------------------------------- Captured log call --------------------------------------------------------------------
WARNING absl:dtype_utils.py:43 You use TensorFlow DType in tfds.features This will soon be deprecated in favor of NumPy DTypes. In the meantime it was converted to float32.
__________________________________________________ MediapiSkelTest.test_download_and_prepare_as_dataset ___________________________________________________

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: in setUp
    self.builder = self._make_builder()
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: in _make_builder
    return self.dataset_class(  # pylint: disable=not-callable
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1335: in __init__
    super().__init__(**kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:283: in __init__
    self.info.initialize_from_bucket()
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:472: in info
    info = self._info()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def _info(self) -> tfds.core.DatasetInfo:
        """Returns the dataset metadata."""

        features = {
            "id": tfds.features.Text(),
            "metadata": {
                "fps": tf.int32,
                "height": tf.int32,
                "width": tf.int32,
                "duration": tf.float32,
                "frames": tf.int32,
            },
            "subtitles": tfds.features.Sequence({
                "start_time": tf.float32,
                "end_time": tf.float32,
                "text": tfds.features.Text(),
            }),
        }

        if self._builder_config.include_pose is not None:
            pose_header_path = _POSE_HEADERS[self._builder_config.include_pose]

            if self._builder_config.fps is not None:
                print("Pose FPS is not implemented for mediapi_skel dataset (since the original fps is not consistent)")

            if self._builder_config.include_pose == "openpose":
                pose_shape = (None, 1, 137, 2)
>               raise NotImplementedError("Openpose is available, but not yet implemented for mediapi_skel dataset.")
E               NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset.

sign_language_datasets\datasets\mediapi_skel\mediapi_skel.py:101: NotImplementedError
________________________________________________________________ MediapiSkelTest.test_info ________________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: in _make_builder
    return self.dataset_class(  # pylint: disable=not-callable
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1335: in __init__
    super().__init__(**kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:283: in __init__
    self.info.initialize_from_bucket()
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:472: in info
    info = self._info()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def _info(self) -> tfds.core.DatasetInfo:
        """Returns the dataset metadata."""

        features = {
            "id": tfds.features.Text(),
            "metadata": {
                "fps": tf.int32,
                "height": tf.int32,
                "width": tf.int32,
                "duration": tf.float32,
                "frames": tf.int32,
            },
            "subtitles": tfds.features.Sequence({
                "start_time": tf.float32,
                "end_time": tf.float32,
                "text": tfds.features.Text(),
            }),
        }

        if self._builder_config.include_pose is not None:
            pose_header_path = _POSE_HEADERS[self._builder_config.include_pose]

            if self._builder_config.fps is not None:
                print("Pose FPS is not implemented for mediapi_skel dataset (since the original fps is not consistent)")

            if self._builder_config.include_pose == "openpose":
                pose_shape = (None, 1, 137, 2)
>               raise NotImplementedError("Openpose is available, but not yet implemented for mediapi_skel dataset.")
E               NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset.

sign_language_datasets\datasets\mediapi_skel\mediapi_skel.py:101: NotImplementedError
_____________________________________________________________ MediapiSkelTest.test_registered _____________________________________________________________

self = 

    def setUp(self):
        super(DatasetBuilderTestCase, self).setUp()
        self.patchers = []
>       self.builder = self._make_builder()

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: in _make_builder
    return self.dataset_class(  # pylint: disable=not-callable
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1335: in __init__
    super().__init__(**kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:283: in __init__
    self.info.initialize_from_bucket()
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__
    return function(*args, **kwargs)
..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:472: in info
    info = self._info()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def _info(self) -> tfds.core.DatasetInfo:
        """Returns the dataset metadata."""

        features = {
            "id": tfds.features.Text(),
            "metadata": {
                "fps": tf.int32,
                "height": tf.int32,
                "width": tf.int32,
                "duration": tf.float32,
                "frames": tf.int32,
            },
            "subtitles": tfds.features.Sequence({
                "start_time": tf.float32,
                "end_time": tf.float32,
                "text": tfds.features.Text(),
            }),
        }

        if self._builder_config.include_pose is not None:
            pose_header_path = _POSE_HEADERS[self._builder_config.include_pose]

            if self._builder_config.fps is not None:
                print("Pose FPS is not implemented for mediapi_skel dataset (since the original fps is not consistent)")

            if self._builder_config.include_pose == "openpose":
                pose_shape = (None, 1, 137, 2)
>               raise NotImplementedError("Openpose is available, but not yet implemented for mediapi_skel dataset.")
E               NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset.
```
sign_language_datasets\datasets\mediapi_skel\mediapi_skel.py:101: NotImplementedError ___________________________________________________________ MediapiSkelTest.test_tags_are_valid ___________________________________________________________ self = def setUp(self): super(DatasetBuilderTestCase, self).setUp() self.patchers = [] > self.builder = self._make_builder() ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: in _make_builder return self.dataset_class( # pylint: disable=not-callable ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator return function(*args, **kwargs) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1335: in __init__ super().__init__(**kwargs) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:288: in decorator return function(*args, **kwargs) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:283: in __init__ self.info.initialize_from_bucket() ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__ return function(*args, **kwargs) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:472: in info info = self._info() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _info(self) -> tfds.core.DatasetInfo: """Returns the dataset metadata.""" features = { "id": tfds.features.Text(), "metadata": { "fps": tf.int32, "height": tf.int32, "width": tf.int32, "duration": tf.float32, "frames": tf.int32, }, "subtitles": tfds.features.Sequence({ "start_time": tf.float32, "end_time": tf.float32, "text": tfds.features.Text(), }), } if self._builder_config.include_pose is not None: pose_header_path = _POSE_HEADERS[self._builder_config.include_pose] if self._builder_config.fps is not None: print("Pose FPS is not implemented for mediapi_skel dataset (since the original fps is not consistent)") if self._builder_config.include_pose == "openpose": pose_shape = (None, 1, 137, 2) > raise NotImplementedError("Openpose is available, but not yet implemented for mediapi_skel dataset.") E NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset. 
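If the openpose config is intentionally unimplemented, one option is to keep the test harness away from it. A minimal sketch, assuming the builder class is named `MediapiSkel` and that a working config is named `holistic` (both are my guesses from the traceback, not verified against the repo):

```python
import tensorflow_datasets as tfds

from sign_language_datasets.datasets.mediapi_skel import MediapiSkel  # assumed import path


class MediapiSkelTest(tfds.testing.DatasetBuilderTestCase):
    DATASET_CLASS = MediapiSkel
    # DatasetBuilderTestCase only exercises the configs listed here; leaving
    # out "openpose" avoids the NotImplementedError raised in _info().
    BUILDER_CONFIG_NAMES_TO_TEST = ["holistic"]
```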
**MyDataset (tfds template)**

All five MyDataset tests fail identically while loading package metadata: tfds reads every metadata file as UTF-8, and one of them contains a byte that isn't valid UTF-8. Judging by the `{info.todo}` header and the tag lines in the byte string, the file is the `TAGS.txt` left over from the `tfds new` template, saved with CRLF line endings and likely in a non-UTF-8 encoding. Condensed traceback:

```python
sign_language_datasets\datasets\my_dataset\my_dataset_dataset_builder.py:17: in _info
    return self.dataset_info_from_configs(
..\tensorflow_datasets\core\dataset_builder.py:1114: in dataset_info_from_configs
    metadata = self.get_metadata()
..\tensorflow_datasets\core\dataset_metadata.py:83: in load
    raw_metadata = _read_files(pkg_path)
..\tensorflow_datasets\core\dataset_metadata.py:107: in <lambda>
    lambda f: f.read_text(encoding="utf-8"), name2path

input = b"// {info.todo}: remove tags which do not apply to dataset.\r\ncontent.data-type.3d # Contains 3d data.\r\ncontent.da...arning task.\r\nml.task.word-sense-disambiguation # Relates to Word Sense Disambiguation, a machine learning task.\r\n"

>       (result, consumed) = self._buffer_decode(data, self.errors, final)
E       UnicodeDecodeError: 'utf-8' codec can't decode byte 0xed in position 4352: invalid continuation byte

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\codecs.py:322: UnicodeDecodeError
```
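A quick diagnostic to pin down the offending byte (the `TAGS.txt` path is my assumption from the byte string above):

```python
from pathlib import Path

# Try to decode the metadata file the way tfds does and report where it breaks.
path = Path("sign_language_datasets/datasets/my_dataset/TAGS.txt")  # assumed location
raw = path.read_bytes()
try:
    raw.decode("utf-8")
    print("file decodes cleanly as UTF-8")
except UnicodeDecodeError as e:
    print(f"bad byte 0x{raw[e.start]:02x} at offset {e.start}")
    print(raw[max(0, e.start - 30):e.start + 30])  # surrounding context
```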
**RWTH Phoenix 2014 T (poses config)**

`test_download_and_prepare_as_dataset` fails while encoding the very first example: the `pose` feature tries to read a dummy pose file that cannot be opened. Condensed traceback:

```python
sign_language_datasets\utils\features\pose_feature.py:167: in encode_example
    encoded_pose = pose_f.read()
..\tensorflow\python\lib\io\file_io.py:116: in read
    self._preread_check()

E   RuntimeError: Failed to encode example:
E   {'id': '25October_2010_Monday_tagesschau-17', 'signer': 'Signer01', 'gloss': 'REGEN SCHNEE REGION VERSCHWINDEN NORD REGEN KOENNEN REGION STERN KOENNEN SEHEN', 'text': 'regen und schnee lassen an den alpen in der nacht nach im norden und nordosten fallen hier und da schauer sonst ist das klar', 'pose': 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\rwth_phoenix2014_t\\dummy_data\\poses\\holistic\\dev\\25October_2010_Monday_tagesschau-17.pose'}
E   NotFoundError: In ... with name "pose":
E   NewRandomAccessFile failed to Create/Open: C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets\sign_language_datasets\datasets\rwth_phoenix2014_t\dummy_data\poses\holistic\dev\25October_2010_Monday_tagesschau-17.pose : The system cannot find the path specified.
E   ; No such process

..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\utils\py_utils.py:388: RuntimeError
```

Captured stdout shows the run got as far as `Downloading and preparing dataset ... to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmpl5s7ercs\rwth_phoenix2014_t\poses\3.0.0...` and stderr shows split generation starting (`Generating splits...: 0%| 0/3`) before the failure.
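Worth ruling out whether the dummy pose file is actually in the checkout (it may be missing, or this may be a Windows path quirk). A trivial check:

```python
from pathlib import Path

# Does the dummy pose file the test asks for exist, and how long is the
# absolute path? (Windows' classic MAX_PATH limit is 260 characters.)
p = Path(
    r"C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets"
    r"\sign_language_datasets\datasets\rwth_phoenix2014_t\dummy_data"
    r"\poses\holistic\dev\25October_2010_Monday_tagesschau-17.pose"
)
print("exists:", p.exists(), "| absolute path length:", len(str(p)))
```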
**Sign2MINT**

`test_download_and_prepare_as_dataset` hits the same folder-`open` `PermissionError` already quoted at the top of this issue:

```python
>       with open(annotations_path, "r", encoding="utf-8") as f:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\sign2mint\\dummy_data'

sign_language_datasets\datasets\sign2mint\sign2mint.py:88: PermissionError
```
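For reference, passing a directory to `open()` is exactly what produces this error on Windows, which supports the folder-vs-file theory. A minimal repro:

```python
import tempfile

# open() on a directory raises PermissionError (errno 13) on Windows and
# IsADirectoryError (errno 21) on Linux, so the failure mode is
# platform-dependent but always an error.
d = tempfile.mkdtemp()
try:
    with open(d, "r", encoding="utf-8") as f:
        f.read()
except OSError as e:
    print(type(e).__name__, e.errno)
```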
**Sign WordNet**

`test_download_and_prepare_as_dataset` fails on a missing optional dependency rather than a loader bug; the guarded import re-raises with an install hint:

```python
    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        """Returns SplitGenerators."""
        dataset_warning(self)
        try:
            import nltk
        except ImportError:
>           raise ImportError("Please install nltk with: pip install nltk")
E           ImportError: Please install nltk with: pip install nltk

sign_language_datasets\datasets\sign_wordnet\sign_wordnet.py:87: ImportError
```

Installing nltk in the test environment (or declaring it as a test dependency) should clear this one.

**SignBank**

`test_download_and_prepare_as_dataset` fails with the folder-`open` `PermissionError` this issue is about:

```python
        index = dl_manager.download("http://signbank.org/signpuddle2.0/data/spml/")
        regex = r"\"sgn[\d]+.spml\""
>       with open(index, "r", encoding="utf-8") as f:
E       PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\datasets\\signbank\\dummy_data'

sign_language_datasets\datasets\signbank\signbank.py:218: PermissionError
```
**SignSuisse**

The default config actually downloads and prepares successfully (`Dataset sign_suisse downloaded and prepared to ... Subsequent calls will reuse this data.`), but the holistic config fails: `dl_manager.download_and_extract(_POSE_URLS[...])` returns a list here, and the code immediately calls `.joinpath` on it as if it were a single path:

```python
        if self._builder_config.include_pose is not None:
            poses_dir = dl_manager.download_and_extract(_POSE_URLS[self._builder_config.include_pose])
            id_func = lambda opt: 'ss' + hashlib.md5(("signsuisse" + opt[0] + opt[1]).encode()).hexdigest()
            for datum in data:
>               pose_file = poses_dir.joinpath(id_func([datum["id"], "isolated"]) + ".pose")
E               AttributeError: 'list' object has no attribute 'joinpath'

sign_language_datasets\datasets\signsuisse\signsuisse.py:227: AttributeError
```
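`download_and_extract` mirrors the structure of its argument (a list of URLs in, a list of paths out), which would explain the `AttributeError`. A small normalization helper, as a sketch (the helper name is mine, and I'm assuming the builder expects exactly one extracted directory):

```python
from typing import List, Union


def single_path(result: Union[str, List[str]]) -> str:
    """Unwrap a dl_manager.download_and_extract result when a one-URL list
    was passed in: tfds mirrors the input structure, so a list in means a
    list out."""
    if isinstance(result, (list, tuple)):
        if len(result) != 1:
            raise ValueError(f"expected exactly one extracted path, got {len(result)}")
        return result[0]
    return result


# e.g. poses_dir = single_path(dl_manager.download_and_extract(_POSE_URLS[...]))
print(single_path(["/tmp/poses"]))  # -> /tmp/poses
```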
**SignTyp**

`test_download_and_prepare_as_dataset` requires a live session: the builder first insists on a `PHPSESSID` in the config's `extra` dict (`"Missing PHPSESSID extra parameter. Go to https://signtyp.uconn.edu/signpuddle/ and copy your PHPSESSID from any network request."`), then makes a real `requests.post` to https://signtyp.uconn.edu/signpuddle/export.php with browser-like headers and that cookie. The response doesn't start with the expected SPML prefix (the string literal is lost in this log), so the loader raises:

```python
        res = requests.post("https://signtyp.uconn.edu/signpuddle/export.php", data=data, headers=headers, cookies=cookies)
        spml = res.text
        if not spml.startswith(...):  # expected SPML prefix; literal garbled in this log
>           raise Exception("PHPSESSID might be expired.")
E           Exception: PHPSESSID might be expired.

sign_language_datasets\datasets\signtyp\signtyp.py:97: Exception
```

So this test hits the network even in the dummy-data harness and can never pass without a fresh cookie.
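One way to keep this test offline would be to stub the live request. A sketch, assuming `signtyp.py` does `import requests` at module level and that a small SPML fixture exists in the test's dummy data (both assumptions):

```python
from unittest import mock

# Replace the live SignPuddle export call with a canned response so the
# builder test no longer depends on a valid PHPSESSID.
fake_response = mock.Mock()
fake_response.text = "...SPML document from a dummy-data fixture..."  # placeholder content

with mock.patch(
    "sign_language_datasets.datasets.signtyp.signtyp.requests.post",  # assumed patch target
    return_value=fake_response,
):
    ...  # run the builder test under the patch
```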
**SwojsGlossario**

`test_download_and_prepare_as_dataset` unpacks the result of a two-URL download, but the mocked `DownloadManager` hands back a single `WindowsGPath`:

```python
    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        dataset_warning(self)
>       annotations_path, media_path = dl_manager.download(
            ["http://swojs.ibict.br/portal/api/items?page", "http://swojs.ibict.br/portal/api/media?page"]
        )
E       TypeError: cannot unpack non-iterable WindowsGPath object

sign_language_datasets\datasets\swojs_glossario\swojs_glossario.py:83: TypeError
```
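`DatasetBuilderTestCase` lets a test declare what the mocked `dl_manager.download(...)` should return, which looks like the right lever here. A sketch (class and dummy file names are my assumptions):

```python
import tensorflow_datasets as tfds

from sign_language_datasets.datasets.swojs_glossario import SwojsGlossario  # assumed import path


class SwojsGlossarioTest(tfds.testing.DatasetBuilderTestCase):
    DATASET_CLASS = SwojsGlossario
    # Make the mock return one dummy file per requested URL so the
    # `annotations_path, media_path = ...` unpack succeeds.
    DL_DOWNLOAD_RESULT = ["items.json", "media.json"]
```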
) with self.subTest('eager_mode'): > f(self, *args, **kwargs) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\test_utils.py:417: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:385: in test_download_and_prepare_as_dataset self._download_and_prepare_as_dataset(builder) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:462: in _download_and_prepare_as_dataset builder.download_and_prepare(download_config=download_config) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\logging\__init__.py:168: in __call__ return function(*args, **kwargs) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:689: in download_and_prepare self._download_and_prepare( ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\core\dataset_builder.py:1563: in _download_and_prepare split_generators = self._split_generators( # pylint: disable=unexpected-keyword-arg sign_language_datasets\datasets\wlasl\wlasl.py:107: in _split_generators data = json.load(f) ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\json\__init__.py:293: in load return loads(fp.read(), ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow\python\lib\io\file_io.py:116: in read self._preread_check() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _preread_check(self): if not self._read_buf: if not self._read_check_passed: raise errors.PermissionDeniedError(None, None, "File isn't open for reading") > self._read_buf = _pywrap_file_io.BufferedInputStream( compat.path_to_str(self.__name), 1024 * 512) E tensorflow.python.framework.errors_impl.UnknownError: NewRandomAccessFile failed to Create/Open: C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets\sign_language_datasets\datasets\wlasl\dummy_data : Access is denied. E ; Input/output error ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow\python\lib\io\file_io.py:77: UnknownError ------------------------------------------------------------------ Captured stdout call ------------------------------------------------------------------- Total configs: 1 Testing config default Downloading and preparing dataset Unknown size (download: Unknown size, generated: Unknown size, total: Unknown size) to C:\Users\Colin\AppData\Local\Temp\run0vq9m9l6\tmp8h9krcr4\wlasl\default\0.3.0... 
------------------------------------------------------------------ Captured stderr call ------------------------------------------------------------------- 2024-02-21 14:22:12.560582: E external/local_tsl/tsl/platform/windows/windows_file_system.cc:363] ERROR: GetSymbolicLinkTarget cannot open file for \\?\C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets\sign_language_datasets\datasets\wlasl\dummy_data GetLastError: 5 ________________________________________________________________ WMTSLTTest.test_baseclass ________________________________________________________________ self = def setUp(self): super(DatasetBuilderTestCase, self).setUp() self.patchers = [] > self.builder = self._make_builder() ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , config = None def _make_builder(self, config=None): > return self.dataset_class( # pylint: disable=not-callable data_dir=self.tmp_dir, config=config, version=self.VERSION ) E TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError _____________________________________________________ WMTSLTTest.test_download_and_prepare_as_dataset _____________________________________________________ ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: in setUp self.builder = self._make_builder() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , config = None def _make_builder(self, config=None): > return self.dataset_class( # pylint: disable=not-callable data_dir=self.tmp_dir, config=config, version=self.VERSION ) E TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError __________________________________________________________________ WMTSLTTest.test_info ___________________________________________________________________ self = def setUp(self): super(DatasetBuilderTestCase, self).setUp() self.patchers = [] > self.builder = self._make_builder() ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , config = None def _make_builder(self, config=None): > return self.dataset_class( # pylint: disable=not-callable data_dir=self.tmp_dir, config=config, version=self.VERSION ) E TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError 
_______________________________________________________________ WMTSLTTest.test_registered ________________________________________________________________ self = def setUp(self): super(DatasetBuilderTestCase, self).setUp() self.patchers = [] > self.builder = self._make_builder() ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , config = None def _make_builder(self, config=None): > return self.dataset_class( # pylint: disable=not-callable data_dir=self.tmp_dir, config=config, version=self.VERSION ) E TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError _____________________________________________________________ WMTSLTTest.test_tags_are_valid ______________________________________________________________ self = def setUp(self): super(DatasetBuilderTestCase, self).setUp() self.patchers = [] > self.builder = self._make_builder() ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , config = None def _make_builder(self, config=None): > return self.dataset_class( # pylint: disable=not-callable data_dir=self.tmp_dir, config=config, version=self.VERSION ) E TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' ..\..\..\..\miniconda3\envs\sign_language_datasets_source\lib\site-packages\tensorflow_datasets\testing\dataset_builder_testing.py:350: TypeError _______________________________________________________ TestOCR.test_should_extract_fsw_from_image ________________________________________________________ self = def test_should_extract_fsw_from_image(self): dirname = os.path.dirname(__file__) img_path = os.path.join(dirname, "assets/sign.png") img_rgb = cv2.imread(img_path) symbols = ["S1f520", "S1f528", "S23c04", "S23c1c", "S2fb04", "S2ff00", "S33b10"] > self.assertEqual( image_to_fsw(img_rgb, symbols), "M239x127S2ff00043x057S23c04071x118S23c1c028x118S1f520062x100S1f528035x100S2fb04054x181S33b10054x083", ) E AssertionError: 'M500x500S2ff00423x493S23c04451x554S23c1c40[53 chars]x519' != 'M239x127S2ff00043x057S23c04071x118S23c1c02[53 chars]x083' E - M500x500S2ff00423x493S23c04451x554S23c1c408x554S1f520442x536S1f528415x536S2fb04434x617S33b10434x519 E + M239x127S2ff00043x057S23c04071x118S23c1c028x118S1f520062x100S1f528035x100S2fb04054x181S33b10054x083 sign_language_datasets\utils\signwriting\ocr\ocr_test.py:17: AssertionError ==================================================================== warnings summary ===================================================================== sign_language_datasets/datasets/asl_lex/asl_lex_test.py: 1 warning sign_language_datasets/datasets/autsl/autsl_test.py: 1 warning sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py: 1 warning sign_language_datasets/datasets/dgs_types/dgs_types_test.py: 1 warning 
sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py: 1 warning sign_language_datasets/datasets/how2sign/how2sign_test.py: 1 warning sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py: 2 warnings sign_language_datasets/datasets/sign2mint/sign2mint_test.py: 1 warning sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py: 1 warning sign_language_datasets/datasets/signbank/signbank_test.py: 1 warning sign_language_datasets/datasets/signsuisse/signsuisse_test.py: 1 warning sign_language_datasets/datasets/signtyp/signtyp_test.py: 1 warning sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py: 1 warning sign_language_datasets/datasets/wlasl/wlasl_test.py: 1 warning C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets\sign_language_datasets\datasets\warning.py:5: UserWarning: This library provides access to data sets without claiming ownership over them or defining their licensing terms. Users who download data are responsible for checking the license of each individual data set. warnings.warn( sign_language_datasets/datasets/aslg_pc12/aslg_pc12_test.py::AslgPc12Test::test_download_and_prepare_as_dataset C:\Users\Colin\miniconda3\envs\sign_language_datasets_source\lib\site-packages\sign_language_datasets\datasets\warning.py:5: UserWarning: This library provides access to data sets without claiming ownership over them or defining their licensing terms. Users who download data are responsible for checking the license of each individual data set. warnings.warn( -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html ---------- coverage: platform win32, python 3.10.13-final-0 ---------- Name Stmts Miss Cover --------------------------------------------------------------------------------------------------- sign_language_datasets\__init__.py 0 0 100% sign_language_datasets\datasets\__init__.py 21 0 100% sign_language_datasets\datasets\asl_lex\__init__.py 1 0 100% sign_language_datasets\datasets\asl_lex\asl_lex.py 27 4 85% sign_language_datasets\datasets\asl_lex\asl_lex_test.py 7 1 86% sign_language_datasets\datasets\aslg_pc12\__init__.py 1 0 100% sign_language_datasets\datasets\aslg_pc12\aslg_pc12.py 21 0 100% sign_language_datasets\datasets\aslg_pc12\aslg_pc12_test.py 8 1 88% sign_language_datasets\datasets\autsl\__init__.py 1 0 100% sign_language_datasets\datasets\autsl\autsl.py 124 65 48% sign_language_datasets\datasets\autsl\autsl_test.py 7 1 86% sign_language_datasets\datasets\bsl_corpus\__init__.py 1 0 100% sign_language_datasets\datasets\bsl_corpus\bsl_corpus.py 68 47 31% sign_language_datasets\datasets\bsl_corpus\bsl_corpus_test.py 7 1 86% sign_language_datasets\datasets\bsl_corpus\bsl_corpus_utils.py 188 155 18% sign_language_datasets\datasets\chicagofswild\__init__.py 1 0 100% sign_language_datasets\datasets\chicagofswild\chicagofswild.py 68 24 65% sign_language_datasets\datasets\chicagofswild\chicagofswild_test.py 7 1 86% sign_language_datasets\datasets\config.py 43 6 86% sign_language_datasets\datasets\dgs_corpus\__init__.py 1 0 100% sign_language_datasets\datasets\dgs_corpus\create_index.py 22 22 0% sign_language_datasets\datasets\dgs_corpus\dgs_corpus.py 214 163 24% sign_language_datasets\datasets\dgs_corpus\dgs_corpus_test.py 80 10 88% sign_language_datasets\datasets\dgs_corpus\dgs_utils.py 59 59 0% sign_language_datasets\datasets\dgs_corpus\splits\__init__.py 0 0 100% sign_language_datasets\datasets\dgs_corpus\splits\create_document_split.py 47 47 0% 
sign_language_datasets\datasets\dgs_types\__init__.py 1 0 100% sign_language_datasets\datasets\dgs_types\dgs_types.py 113 79 30% sign_language_datasets\datasets\dgs_types\dgs_types_test.py 7 1 86% sign_language_datasets\datasets\dgs_types\make_poses.py 45 45 0% sign_language_datasets\datasets\dicta_sign\__init__.py 1 0 100% sign_language_datasets\datasets\dicta_sign\dicta_sign.py 63 27 57% sign_language_datasets\datasets\dicta_sign\dicta_sign_test.py 7 1 86% sign_language_datasets\datasets\how2sign\__init__.py 1 0 100% sign_language_datasets\datasets\how2sign\how2sign.py 51 13 75% sign_language_datasets\datasets\how2sign\how2sign_test.py 7 1 86% sign_language_datasets\datasets\mediapi_skel\__init__.py 1 0 100% sign_language_datasets\datasets\mediapi_skel\create_pose_headers.py 11 11 0% sign_language_datasets\datasets\mediapi_skel\mediapi_skel.py 45 19 58% sign_language_datasets\datasets\mediapi_skel\mediapi_skel_test.py 7 1 86% sign_language_datasets\datasets\mediapi_skel\mediapi_utils.py 83 73 12% sign_language_datasets\datasets\my_dataset\__init__.py 0 0 100% sign_language_datasets\datasets\my_dataset\my_dataset_dataset_builder.py 12 4 67% sign_language_datasets\datasets\my_dataset\my_dataset_dataset_builder_test.py 7 1 86% sign_language_datasets\datasets\ngt_corpus\__init__.py 1 0 100% sign_language_datasets\datasets\ngt_corpus\create_index.py 88 88 0% sign_language_datasets\datasets\ngt_corpus\ngt_corpus.py 100 72 28% sign_language_datasets\datasets\ngt_corpus\ngt_corpus_test.py 22 0 100% sign_language_datasets\datasets\ngt_corpus\ngt_corpus_utils.py 24 2 92% sign_language_datasets\datasets\rwth_phoenix2014_t\__init__.py 1 0 100% sign_language_datasets\datasets\rwth_phoenix2014_t\rwth_phoenix2014_t.py 55 0 100% sign_language_datasets\datasets\rwth_phoenix2014_t\rwth_phoenix2014_t_test.py 16 1 94% sign_language_datasets\datasets\sign2mint\__init__.py 1 0 100% sign_language_datasets\datasets\sign2mint\sign2mint.py 60 30 50% sign_language_datasets\datasets\sign2mint\sign2mint_test.py 7 1 86% sign_language_datasets\datasets\sign_wordnet\__init__.py 1 0 100% sign_language_datasets\datasets\sign_wordnet\sign_wordnet.py 59 37 37% sign_language_datasets\datasets\sign_wordnet\sign_wordnet_test.py 7 1 86% sign_language_datasets\datasets\signbank\__init__.py 1 0 100% sign_language_datasets\datasets\signbank\signbank.py 56 29 48% sign_language_datasets\datasets\signbank\signbank_test.py 7 1 86% sign_language_datasets\datasets\signsuisse\__init__.py 1 0 100% sign_language_datasets\datasets\signsuisse\download_holistic_gcs.py 67 67 0% sign_language_datasets\datasets\signsuisse\example.py 27 27 0% sign_language_datasets\datasets\signsuisse\signsuisse.py 125 52 58% sign_language_datasets\datasets\signsuisse\signsuisse_test.py 8 1 88% sign_language_datasets\datasets\signtyp\__init__.py 1 0 100% sign_language_datasets\datasets\signtyp\signtyp.py 35 8 77% sign_language_datasets\datasets\signtyp\signtyp_test.py 7 1 86% sign_language_datasets\datasets\swojs_glossario\__init__.py 1 0 100% sign_language_datasets\datasets\swojs_glossario\swojs_glossario.py 61 38 38% sign_language_datasets\datasets\swojs_glossario\swojs_glossario_test.py 7 1 86% sign_language_datasets\datasets\warning.py 3 0 100% sign_language_datasets\datasets\wlasl\__init__.py 1 0 100% sign_language_datasets\datasets\wlasl\wlasl.py 65 34 48% sign_language_datasets\datasets\wlasl\wlasl_test.py 7 1 86% sign_language_datasets\datasets\wmt_slt\__init__.py 1 0 100% sign_language_datasets\datasets\wmt_slt\utils.py 95 74 22% 
sign_language_datasets\datasets\wmt_slt\wmt_slt.py 102 74 27% sign_language_datasets\datasets\wmt_slt\wmt_slt_test.py 7 1 86% sign_language_datasets\utils\__init__.py 0 0 100% sign_language_datasets\utils\downloaders\__init__.py 0 0 100% sign_language_datasets\utils\downloaders\aslpro.py 11 7 36% sign_language_datasets\utils\downloaders\youtube.py 8 7 12% sign_language_datasets\utils\features\__init__.py 1 0 100% sign_language_datasets\utils\features\pose_feature.py 91 43 53% sign_language_datasets\utils\get_pose_header.py 8 8 0% sign_language_datasets\utils\signwriting\__init__.py 0 0 100% sign_language_datasets\utils\signwriting\ocr\__init__.py 1 0 100% sign_language_datasets\utils\signwriting\ocr\ocr.py 89 74 17% sign_language_datasets\utils\signwriting\ocr\ocr_test.py 11 0 100% sign_language_datasets\utils\torch_dataset.py 26 26 0% --------------------------------------------------------------------------------------------------- TOTAL 2859 1689 41% ================================================================= short test summary info ================================================================= FAILED sign_language_datasets/datasets/asl_lex/asl_lex_test.py::AslLexTest::test_download_and_prepare_as_dataset - PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\dat... FAILED sign_language_datasets/datasets/autsl/autsl_test.py::AutslTest::test_download_and_prepare_as_dataset - AssertionError: Do not use `os`, but `tf.io.gfile` module instead. This makes code compatible with more filesystems. FAILED sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_baseclass - TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password' FAILED sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_download_and_prepare_as_dataset - TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password' FAILED sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_info - TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password' FAILED sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_registered - TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password' FAILED sign_language_datasets/datasets/bsl_corpus/bsl_corpus_test.py::BslCorpusTest::test_tags_are_valid - TypeError: BslCorpus.__init__() missing 2 required positional arguments: 'bslcp_username' and 'bslcp_password' FAILED sign_language_datasets/datasets/chicagofswild/chicagofswild_test.py::ChicagoFSWildTest::test_download_and_prepare_as_dataset - FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_data... 
FAILED sign_language_datasets/datasets/dgs_corpus/dgs_corpus_test.py::TestDgsCorpusAuxiliaryFunctions::test_get_poses_return_type - PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmp6sc7ayxv' FAILED sign_language_datasets/datasets/dgs_corpus/dgs_corpus_test.py::TestDgsCorpusAuxiliaryFunctions::test_get_poses_subset_of_camera_names - PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\AppData\\Local\\Temp\\tmpqgafthwv' FAILED sign_language_datasets/datasets/dgs_types/dgs_types_test.py::DGSTypesTest::test_download_and_prepare_as_dataset - TypeError: 'WindowsGPath' object is not iterable FAILED sign_language_datasets/datasets/dicta_sign/dicta_sign_test.py::DictaSignTest::test_download_and_prepare_as_dataset - PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\dat... FAILED sign_language_datasets/datasets/how2sign/how2sign_test.py::How2signTest::test_download_and_prepare_as_dataset - TypeError: 'WindowsGPath' object is not iterable FAILED sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_baseclass - NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset. FAILED sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_download_and_prepare_as_dataset - NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset. FAILED sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_info - NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset. FAILED sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_registered - NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset. FAILED sign_language_datasets/datasets/mediapi_skel/mediapi_skel_test.py::MediapiSkelTest::test_tags_are_valid - NotImplementedError: Openpose is available, but not yet implemented for mediapi_skel dataset. 
FAILED sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_baseclass - UnicodeDecodeError: 'utf-8' codec can't decode byte 0xed in position 4352: invalid continuation byte FAILED sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_download_and_prepare_as_dataset - UnicodeDecodeError: 'utf-8' codec can't decode byte 0xed in position 4352: invalid continuation byte FAILED sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_info - UnicodeDecodeError: 'utf-8' codec can't decode byte 0xed in position 4352: invalid continuation byte FAILED sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_registered - UnicodeDecodeError: 'utf-8' codec can't decode byte 0xed in position 4352: invalid continuation byte FAILED sign_language_datasets/datasets/my_dataset/my_dataset_dataset_builder_test.py::MyDatasetTest::test_tags_are_valid - UnicodeDecodeError: 'utf-8' codec can't decode byte 0xed in position 4352: invalid continuation byte FAILED sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t_test.py::RWTHPhoenix2014TPosesTest::test_download_and_prepare_as_dataset - RuntimeError: Failed to encode example: FAILED sign_language_datasets/datasets/sign2mint/sign2mint_test.py::Sign2MINTTest::test_download_and_prepare_as_dataset - PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\dat... FAILED sign_language_datasets/datasets/sign_wordnet/sign_wordnet_test.py::SignWordnetTest::test_download_and_prepare_as_dataset - ImportError: Please install nltk with: pip install nltk FAILED sign_language_datasets/datasets/signbank/signbank_test.py::SignbankTest::test_download_and_prepare_as_dataset - PermissionError: [Errno 13] Permission denied: 'C:\\Users\\Colin\\projects\\sign-language\\colin_pull_requesting\\datasets\\sign_language_datasets\\dat... FAILED sign_language_datasets/datasets/signsuisse/signsuisse_test.py::SignSuisseTest::test_download_and_prepare_as_dataset - AttributeError: 'list' object has no attribute 'joinpath' FAILED sign_language_datasets/datasets/signtyp/signtyp_test.py::SignTypTest::test_download_and_prepare_as_dataset - Exception: PHPSESSID might be expired. FAILED sign_language_datasets/datasets/swojs_glossario/swojs_glossario_test.py::SwojsGlossarioTest::test_download_and_prepare_as_dataset - TypeError: cannot unpack non-iterable WindowsGPath object FAILED sign_language_datasets/datasets/wlasl/wlasl_test.py::WlaslTest::test_download_and_prepare_as_dataset - tensorflow.python.framework.errors_impl.UnknownError: NewRandomAccessFile failed to Create/Open: C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets\sign_language_datasets\datasets\wlasl\dummy_data : Access is denied. 
FAILED sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_baseclass - TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' FAILED sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_download_and_prepare_as_dataset - TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' FAILED sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_info - TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' FAILED sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_registered - TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' FAILED sign_language_datasets/datasets/wmt_slt/wmt_slt_test.py::WMTSLTTest::test_tags_are_valid - TypeError: WMTSLT.__init__() missing 3 required positional arguments: 'zenodo_srf_poses_token', 'zenodo_srf_videos_token', and 'zenodo_focusnews_token' FAILED sign_language_datasets/utils/signwriting/ocr/ocr_test.py::TestOCR::test_should_extract_fsw_from_image - AssertionError: 'M500x500S2ff00423x493S23c04451x554S23c1c40[53 chars]x519' != 'M239x127S2ff00043x057S23c04071x118S23c1c02[53 chars]x083' ================================================= 37 failed, 69 passed, 20 skipped, 16 warnings in 14.78s ================================================= (sign_language_datasets_source) C:\Users\Colin\projects\sign-language\colin_pull_requesting\datasets> ```
AmitMY commented 4 months ago

Just dropping notes here; it seems like:

  1. We need to use `tf.io.gfile` in more places (see the first sketch after this list).
  2. For signbank we are opening a file, which is fine; but for sign2mint, the index URL we use (https://sign2mint.de/api/entries/all/) is no longer valid.
  3. Most importantly, `DL_EXTRACT_RESULT` is missing in many test files; it should point to a file under `dummy_data/` (see the second sketch below).
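For point 1, a minimal sketch of the direction this could take, using signbank's index read as the example; the function name and the regex constant here are illustrative, not the actual repo code:

```python
import re

import tensorflow as tf

# Quoted .spml filenames in the SignPuddle index page; an escaped variant of
# the regex shown in the signbank traceback above.
SPML_LINK_REGEX = r"\"sgn\d+\.spml\""


def list_spml_files(index_path: str) -> list:
    """Read a downloaded index page without the built-in open()."""
    # tf.io.gfile handles local paths as well as GCS and other remote
    # filesystems, which is what the AUTSL failure's assertion
    # ("Do not use `os`, but `tf.io.gfile` module instead") asks for.
    with tf.io.gfile.GFile(index_path, "r") as f:
        return re.findall(SPML_LINK_REGEX, f.read())
```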
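For point 3, if I read the TFDS test harness correctly, a test case with no `DL_EXTRACT_RESULT` gets the `dummy_data` directory itself back from the mocked `download`/`download_and_extract`, which would explain both the `PermissionError`s (opening a directory) and the `cannot unpack non-iterable WindowsGPath object` failures above. A sketch of what a fixed test could declare, using swojs_glossario as the example; the dummy file names are made up:

```python
import tensorflow_datasets as tfds

from . import swojs_glossario


class SwojsGlossarioTest(tfds.testing.DatasetBuilderTestCase):
    DATASET_CLASS = swojs_glossario.SwojsGlossario

    # The builder unpacks two paths from dl_manager.download([...]), so the
    # mocked result must mirror that structure. These file names are
    # hypothetical and would have to exist under this test's dummy_data/.
    DL_EXTRACT_RESULT = ["items.json", "media.json"]


if __name__ == "__main__":
    tfds.testing.test_main()
```

For builders like signbank that download a single URL, `DL_EXTRACT_RESULT` would presumably be a single filename string instead of a list, mirroring the structure passed to `dl_manager.download`.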