facebookresearch / TaBERT

This repository contains source code for the TaBERT model, a pre-trained language model for learning joint representations of natural language utterances and (semi-)structured tables for semantic parsing. TaBERT is pre-trained on a massive corpus of 26M Web tables and their associated natural language context, and can be used as a drop-in replacement for a semantic parser's original encoder to compute representations for utterances and table schemas (columns).

Puzzling line of "attention_mask = (1.0 - attention_mask) * -10000.0" in the vertical_transform function #26

Open superctj opened 2 years ago

superctj commented 2 years ago

Thank you for open-sourcing this work! I was looking at the vertical attention code and am confused by this line. Could you please share any insights into it?
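For context, this line looks like the standard BERT-style additive attention mask: a 0/1 padding mask is turned into an additive bias of 0 (keep) or -10000 (mask out) that is added to the pre-softmax attention scores, pushing masked positions to near-zero probability. A minimal sketch of the idea (the helper names here are illustrative, not from the TaBERT codebase):

```python
import numpy as np

def additive_attention_bias(attention_mask):
    # attention_mask: 1.0 for real tokens, 0.0 for padding.
    # (1.0 - mask) * -10000.0 yields 0 for real tokens and -10000 for
    # padding, so adding it to the raw attention logits before softmax
    # drives padded positions to ~0 attention weight.
    return (1.0 - attention_mask) * -10000.0

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

mask = np.array([1.0, 1.0, 0.0])    # last position is padding
scores = np.array([2.0, 1.0, 3.0])  # raw attention logits
probs = softmax(scores + additive_attention_bias(mask))
# the padded position receives ~zero weight even though its raw score was highest
```

Using a large negative bias instead of multiplying probabilities by zero keeps the operation a simple element-wise add that composes with softmax and stays numerically stable.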