fe1ixxu / BiBERT

This is the repository of the EMNLP 2021 paper "BERT, mBERT, or BiBERT? A Study on Contextualized Embeddings for Neural Machine Translation".

How to get logits? #1

Closed nihirv closed 2 years ago

nihirv commented 2 years ago

Hi there.

Thank you for releasing this code.

I'm looking to get the logits of the model predictions.

It seems like the starter code you've posted in the README (i.e. using HuggingFace) only provides the hidden states.
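For concreteness, a minimal sketch of that README-style loading (assuming the `jhu-clsp/bibert-ende` checkpoint name from the README):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Plain AutoModel exposes hidden states only; there is no LM head, hence no logits.
tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/bibert-ende")
model = AutoModel.from_pretrained("jhu-clsp/bibert-ende")

inputs = tokenizer("Hello world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch_size, seq_len, hidden_size)
```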

I'm wondering if you'd be able to release the weights so I can run some generation (without re-training the model)?

Thanks

fe1ixxu commented 2 years ago

It might be an implementation question on the HuggingFace side. You could try loading the model with a masked-LM class to get the logits output: https://huggingface.co/docs/transformers/main_classes/output#transformers.modeling_outputs.MaskedLMOutput.
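A minimal sketch of that suggestion, assuming the same `jhu-clsp/bibert-ende` checkpoint as above and the standard HuggingFace masked-LM head:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# The *ForMaskedLM variant attaches the LM head, so the output carries logits.
tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/bibert-ende")
model = AutoModelForMaskedLM.from_pretrained("jhu-clsp/bibert-ende")

inputs = tokenizer("Hello world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# MaskedLMOutput.logits has shape (batch_size, seq_len, vocab_size).
print(outputs.logits.shape)
```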

yugaljain1999 commented 1 year ago

Hey @fe1ixxu @alexeib @nihirv, I am unable to generate translated text from the contextual embeddings (as per the code shared in the README). I tried to get the logits output, but I didn't get the expected translation even after applying a softmax layer to the logits. Could you please share the exact code for generating the logits and the corresponding English-to-German translation? It would be really appreciated.

Thanks
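Note: BiBERT itself is an encoder; per the paper, translation is performed by a separate fairseq NMT model that consumes BiBERT's contextualized embeddings, so a softmax over the masked-LM logits will not, on its own, produce German output. A sketch of what that route actually yields (same assumed checkpoint as above):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/bibert-ende")
model = AutoModelForMaskedLM.from_pretrained("jhu-clsp/bibert-ende")

inputs = tokenizer("Hello world!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax gives per-position distributions over the shared vocabulary;
# argmax largely reconstructs the input tokens rather than translating them.
probs = torch.softmax(logits, dim=-1)
pred_ids = probs.argmax(dim=-1)
print(tokenizer.batch_decode(pred_ids))
```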