Closed aalok-sathe closed 2 years ago
currently not ready to merge:
Traceback (most recent call last):
File "examples/test_mean_froi_pereira2018_firstsessions.py", line 132, in <module>
main()
File "examples/test_mean_froi_pereira2018_firstsessions.py", line 107, in main
ann_modelcard = ann_enc.get_modelcard()
File "/home/aalok/code/langbrainscore/langbrainscore/encoder/ann.py", line 271, in get_modelcard
config_specs = {k: d_config[k] for k in config_specs_of_interest}
File "/home/aalok/code/langbrainscore/langbrainscore/encoder/ann.py", line 271, in <dictcomp>
config_specs = {k: d_config[k] for k in config_specs_of_interest}
KeyError: 'n_layer'
I think this is related to GPT vs BERT; will get back to this
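The KeyError arises because HuggingFace config attributes differ across model families (e.g. GPT-2 configs expose `n_layer`, while BERT configs expose `num_hidden_layers`), so a strict dict comprehension fails on whichever family lacks the key. A minimal sketch of two possible fixes, using a toy config dict (the names `d_config` and `config_specs_of_interest` come from the traceback; the values here are illustrative):

```python
# BERT-style config dict: has "num_hidden_layers", lacks GPT-2's "n_layer"
d_config = {"num_hidden_layers": 12, "hidden_size": 768}
config_specs_of_interest = ["n_layer", "num_hidden_layers", "hidden_size"]

# Option 1: keep only keys that actually exist in this model's config
config_specs = {k: d_config[k] for k in config_specs_of_interest if k in d_config}

# Option 2: record None for absent keys so the modelcard schema stays uniform
config_specs_uniform = {k: d_config.get(k) for k in config_specs_of_interest}
```

Option 2 may be preferable for a modelcard, since every card then carries the same set of fields regardless of model family.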
#e066d6f solves this.
This code chunk provides a first-draft implementation of tokenizer-backed indices for extracting the appropriate subset of representations from those obtained using a larger context.
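A hypothetical sketch of the idea: tokenize the preceding context alone and the context plus the stimulus, and use the two token counts as slice bounds into the full-context representations. The function name `stimulus_token_span` and the whitespace tokenizer are stand-ins for illustration; a real subword tokenizer is not guaranteed to be prefix-stable, so production code would likely rely on the offset mappings that HuggingFace fast tokenizers return via `return_offsets_mapping` instead.

```python
def stimulus_token_span(tokenize, context, stimulus):
    """Return (start, end) token indices of `stimulus` when it follows `context`.

    Assumes the tokenizer is prefix-stable: tokenizing `context` yields a
    prefix of tokenizing `context + " " + stimulus`. This holds for the toy
    whitespace tokenizer below, but not for subword tokenizers in general.
    """
    start = len(tokenize(context))
    end = len(tokenize(context + " " + stimulus))
    return start, end


# Toy whitespace tokenizer stands in for a real subword tokenizer
tokenize = str.split

context = "the quick brown fox"
stimulus = "jumped over"

# Fake per-token representations for the full context + stimulus sequence
reprs = [[float(i)] for i in range(10)]

start, end = stimulus_token_span(tokenize, context, stimulus)
stim_reprs = reprs[start:end]  # representations for the stimulus tokens only
```

Here the context contributes four tokens, so the stimulus occupies token positions 4 and 5, and `stim_reprs` picks out exactly those rows of the representation matrix.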