I'm reaching out to inquire about the pre-trained ICT models, specifically the weights for both the question encoder BERT_Q(q) and the block encoder BERT_B(b).
I came across the repository "https://github.com/google-research/language/tree/master/language/orqa" on GitHub, which points to the Cloud Storage path "gs://orqa-data/ict" as the source for these weights.
Upon checking, I found that "gs://orqa-data/ict" contains pre-trained weights for only the question encoder BERT_Q(q), along with a dense vector index of shape (13353718, 128).
To be precise, the BERT_Q(q) weights are under "gs://orqa-data/ict/variables", and the dense vector index is under "gs://orqa-data/ict/encoded". However, I could not locate pre-trained weights for the block encoder BERT_B(b).
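In case it helps, this is roughly how I checked which encoders' variables the checkpoint contains. The snippet below is a sketch using a locally saved dummy checkpoint with an illustrative variable name (reading the bucket directly requires gcloud credentials); the same `tf.train.list_variables` call works when pointed at the checkpoint prefix under "gs://orqa-data/ict".

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Build a small stand-in checkpoint; the variable name here is
# hypothetical, chosen only to illustrate grepping by name prefix.
with tf.Graph().as_default():
    tf.get_variable("question_encoder/layer_0/kernel", shape=[2, 3])
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        path = saver.save(sess, os.path.join(tempfile.mkdtemp(), "model.ckpt"))

# tf.train.list_variables yields (name, shape) pairs for every variable
# stored in a checkpoint; filtering these names is how I looked for
# block-encoder (BERT_B) weights alongside the BERT_Q ones.
names = [name for name, shape in tf.train.list_variables(path)]
print(names)
```

Running this against the ICT checkpoint, I only see question-encoder variables, which is why I suspect the block encoder was not exported.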
Would it be possible for you to share the pre-trained weights of the block encoder BERT_B(b), if they are available?
I appreciate your assistance.
Thank you.