facebookresearch / TaBERT

This repository contains source code for the TaBERT model, a pre-trained language model for learning joint representations of natural language utterances and (semi-)structured tables for semantic parsing. TaBERT is pre-trained on a massive corpus of 26M Web tables and their associated natural language context, and can be used as a drop-in replacement for a semantic parser's original encoder to compute representations for utterances and table schemas (columns).

Update vanilla_table_bert.py #5

Open monk1337 opened 3 years ago

monk1337 commented 3 years ago

Details of the issue:

The `output_all_encoded_layers` argument was removed in recent versions of the `transformers` library, so calling `forward` with it throws an error:

```python
def forward(self, input_ids, token_type_ids=None, attention_mask=None, masked_lm_labels=None, **kwargs):
    sequence_output, _ = self._bert_model.bert(
        input_ids, token_type_ids, attention_mask,
        output_all_encoded_layers=False
    )
```

Calling it with that argument gives this error:

```
TypeError: forward() got an unexpected keyword argument 'output_all_encoded_layers'
```

Removing this argument works fine.
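A minimal sketch of the API change, using a stand-in class rather than the real `transformers.BertModel` (which would require downloading weights): newer `transformers` versions drop `output_all_encoded_layers` and return a tuple whose first element is the last hidden state. Note also that in newer signatures `attention_mask` comes *before* `token_type_ids`, so when updating the call in `vanilla_table_bert.py` it is safer to pass those by keyword.

```python
class FakeBertModel:
    """Stand-in mimicking the newer transformers BertModel call signature.
    Not the real class -- it only illustrates the argument/return change."""

    def forward(self, input_ids, attention_mask=None, token_type_ids=None):
        # Newer versions return a tuple: (last_hidden_state, pooled_output).
        # Shapes here are placeholders (batch_size x seq_len-ish lists).
        last_hidden_state = [[0.0] * 4 for _ in input_ids]
        pooled_output = [0.0] * 4
        return last_hidden_state, pooled_output

    __call__ = forward


bert = FakeBertModel()

# Old call style fails, reproducing the error from this issue:
try:
    bert([[1, 2, 3]], output_all_encoded_layers=False)
except TypeError as e:
    print(e)  # unexpected keyword argument 'output_all_encoded_layers'

# Fixed call style: drop the removed argument, pass the masks by keyword,
# and take the first element of the returned tuple.
sequence_output, _ = bert([[1, 2, 3]], attention_mask=None, token_type_ids=None)
```

The equivalent one-line fix inside `forward` would be to call `self._bert_model.bert(input_ids, attention_mask=attention_mask, token_type_ids=token_type_ids)` and unpack the first tuple element, matching what this pull request does by removing the argument.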

facebook-github-bot commented 3 years ago

Hi @monk1337!

Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file.

In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (eg your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!

facebook-github-bot commented 3 years ago

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!