Closed VeritasJoker closed 1 month ago
Option 1: add a flag that forces GloVe embeddings instead of the currently loaded model. Be sure to drop tokens/words that have no GloVe embedding. Option 2: have the pickling step generate a GloVe pickle that already carries the language-model predictions.
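A minimal sketch of what option 1's filtering could look like, assuming the datum is a pandas DataFrame with a `word` column and the embeddings are available as a word-to-vector lookup (all names here are hypothetical, not the actual pipeline's):

```python
import numpy as np
import pandas as pd

# Toy stand-in for the loaded GloVe table (word -> vector).
glove = {"the": np.zeros(50), "quick": np.ones(50)}

datum = pd.DataFrame({
    "word": ["the", "quick", "zzzunknown"],
    "onset": [0.1, 0.5, 0.9],
})

# Keep only rows whose word has a GloVe embedding, then attach the vectors.
datum = datum[datum["word"].isin(glove)].reset_index(drop=True)
datum["embedding"] = datum["word"].map(glove.get)
```

The same mask-then-map pattern would apply whatever the real column names are; the point is to filter before attaching embeddings so no row ends up with a missing vector.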
When we run encoding with GloVe embeddings but restrict to words that have GPT-2 predictions (correct / incorrect / top 30% / bottom 30%), we currently also need to load the GPT-2 datums. Right now the code for this is very hackish; we need a cleaner method.
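One cleaner direction would be an explicit join instead of ad-hoc cross-loading: filter the GPT-2 datum by prediction rank, then inner-join the GloVe datum against it on a shared key. A sketch under assumed column names (`true_pred_prob` and the `(word, onset)` key are invented for illustration):

```python
import pandas as pd

glove_datum = pd.DataFrame({
    "word": ["the", "quick", "fox"],
    "onset": [0.1, 0.5, 0.9],
})
gpt2_datum = pd.DataFrame({
    "word": ["the", "quick", "fox"],
    "onset": [0.1, 0.5, 0.9],
    "true_pred_prob": [0.9, 0.2, 0.05],  # assumed: model's probability of the true word
})

# Keep GPT-2 rows in the top 30% by predicted probability of the true word.
cutoff = gpt2_datum["true_pred_prob"].quantile(0.7)
top30 = gpt2_datum[gpt2_datum["true_pred_prob"] >= cutoff]

# Restrict the GloVe datum to those same words via an inner join on the key.
filtered = glove_datum.merge(top30[["word", "onset"]], on=["word", "onset"], how="inner")
```

Swapping the quantile filter for a correct/incorrect mask gives the other selection modes; either way the GloVe side stays untouched and the GPT-2 side only contributes the row selection.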