Closed Chorleon closed 1 year ago
Hi community, I just got familiar with LLMs recently, so sorry if my question doesn't make sense.

My work requires me to process data locally. Is there a way to use replit_lm_tokenizer locally instead of having to set trust_remote_code = True as described in the README? Thanks a lot.

AFAIK this option is required even when running the model locally, because you are explicitly deciding to trust the code contained in replit_lm_tokenizer. "Remote" is used in the sense that the code wasn't written by you but is part of the replit_lm_tokenizer you downloaded from a remote repository; it still executes on your machine.
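For what it's worth, you can still keep everything on disk: a minimal sketch of loading the tokenizer from an already-downloaded local copy, assuming the standard transformers `AutoTokenizer` API (the directory path and helper name here are hypothetical). `trust_remote_code=True` is still needed because the checkpoint ships custom tokenizer code that must run, but `local_files_only=True` prevents any network access:

```python
def load_local_tokenizer(model_dir: str):
    """Load the Replit tokenizer from a local checkpoint directory.

    trust_remote_code=True is still required: the custom tokenizer code
    (replit_lm_tokenizer) ships inside the checkpoint and is executed
    locally, even though no network access takes place.
    """
    from transformers import AutoTokenizer

    return AutoTokenizer.from_pretrained(
        model_dir,
        trust_remote_code=True,   # trust the custom tokenizer code in model_dir
        local_files_only=True,    # fail instead of contacting the Hub
    )

# Usage (assumes ./replit-code-v1-3b contains the downloaded files):
# tokenizer = load_local_tokenizer("./replit-code-v1-3b")
```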