This repository contains source code for the TaBERT model, a pre-trained language model for learning joint representations of natural language utterances and (semi-)structured tables for semantic parsing. TaBERT is pre-trained on a massive corpus of 26M Web tables and their associated natural language context, and can be used as a drop-in replacement for a semantic parser's original encoder to compute representations for utterances and table schemas (columns).
Column and context embedding values change on every inference run #25
Why do the column embedding and context embedding values change every time we run `encode` (with the pre-trained model) on the same table with the same context? (We tried with just one row in the table, and the embeddings still differ on every run.)
For example:
For the same table and same context, we get the following two embeddings.
tensor([-6.5471e-02, -1.1527e-01, 5.5206e-01, -3.9907e-02, 4.8929e-01, 3.5280e-01, 1.8725e-01, 8.9385e-01, -1.9507e-01, 2.4612e-01,
tensor([-1.6841e-01, -2.1637e-02, 5.4372e-01, -1.2783e-02, 2.2790e-01, 8.3290e-02, 4.2798e-01, 7.9061e-01, -3.3040e-01, 3.3663e-01,
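A common cause of this behavior in PyTorch models is that the model is still in training mode, so dropout layers are active and inject randomness at inference time. Calling `model.eval()` (the standard PyTorch `nn.Module` API) before encoding, optionally inside a `torch.no_grad()` block, typically makes the outputs deterministic. Whether this is the cause here is an assumption; the sketch below is a toy, pure-Python illustration of the effect, not TaBERT's actual code:

```python
import random

def encode(vec, training):
    """Toy 'encoder': applies dropout when training, identity otherwise.

    Real BERT-style encoders behave analogously: dropout is stochastic
    in training mode and disabled after model.eval() is called.
    """
    if not training:
        return list(vec)
    p = 0.1
    # Inverted dropout: zero each element with prob p, rescale the rest.
    return [0.0 if random.random() < p else x / (1 - p) for x in vec]

vec = [0.1 * i for i in range(64)]

# "Training" mode: two runs on identical input almost surely differ.
a = encode(vec, training=True)
b = encode(vec, training=True)

# "Eval" mode: runs on identical input are deterministic.
c = encode(vec, training=False)
d = encode(vec, training=False)
print(c == d)
```

If switching to eval mode does not remove the variation, other sources of randomness (e.g. unseeded sampling of table rows during preprocessing) would be worth checking.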