Table Transformer (TATR) is a deep learning model for extracting tables from unstructured documents (PDFs and images). This is also the official repository for the PubTables-1M dataset and GriTS evaluation metric.
MIT License
Update in inference.py -- omitting the optimizer's state #165
When loading a trained model for inference, this change loads only the model's weights and skips restoring the optimizer's state. This is particularly useful when a pre-trained model is used for inference or transfer learning, where the optimizer's state is unnecessary.
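A minimal sketch of the idea, using plain dictionaries rather than the actual `inference.py` code: the checkpoint keys (`model_state_dict`, `optimizer_state_dict`) are illustrative assumptions about how the checkpoint is structured, not necessarily the exact keys the repository uses.

```python
def load_model_weights(checkpoint: dict) -> dict:
    """Return only the model weights from a checkpoint, dropping optimizer state.

    In PyTorch this step would typically be:
        model.load_state_dict(checkpoint["model_state_dict"])
    with no corresponding optimizer.load_state_dict(...) call at inference time.
    """
    # Keep just the weights; the optimizer's state is irrelevant for inference.
    return checkpoint["model_state_dict"]


# Hypothetical checkpoint saved during training, containing both
# model weights and optimizer state (keys are assumptions for illustration).
checkpoint = {
    "model_state_dict": {"layer.weight": [0.1, 0.2]},
    "optimizer_state_dict": {"lr": 1e-4, "momentum": 0.9},  # ignored at inference
}

weights = load_model_weights(checkpoint)
print(sorted(weights.keys()))
```

The design point is simply that restoring optimizer state costs memory and can fail when the training and inference environments differ, so skipping it makes checkpoint loading leaner and more robust for inference-only use.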