replit / ReplitLM

Inference code and configs for the ReplitLM model family
https://huggingface.co/replit
Apache License 2.0
934 stars 83 forks

Using replit_lm_tokenizer locally #30

Closed Chorleon closed 1 year ago

Chorleon commented 1 year ago

Hi community, I only recently became familiar with LLMs, so sorry if my question doesn't make sense.

Since my work requires me to process data locally, is there a way to use replit_lm_tokenizer locally instead of having to set trust_remote_code=True as described in the README? Thanks a lot.

gjmulder commented 1 year ago

AFAIK this option is required even when running the model locally, as you are explicitly deciding to trust the code contained in replit_lm_tokenizer. "Remote" is used in the sense that the code wasn't written by you, but is part of the replit_lm_tokenizer you downloaded from a remote repository.
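For context, here is a minimal sketch of what this looks like in practice. It assumes the model repo has already been downloaded to a local folder (the folder name `./replit-code-v1-3b` is illustrative, not from this thread). Even for a purely local load, `trust_remote_code=True` is still needed so that transformers is allowed to execute the custom `replit_lm_tokenizer.py` shipped inside the repo; adding `local_files_only=True` ensures no network request is made at load time:

```python
def load_local_tokenizer(local_dir: str):
    """Load the Replit tokenizer from an already-downloaded snapshot.

    trust_remote_code=True is still required even for a local load: it
    permits transformers to execute the custom tokenizer class
    (replit_lm_tokenizer.py) that ships inside the repo folder.
    local_files_only=True guarantees no request is made to the Hub.
    """
    from transformers import AutoTokenizer

    return AutoTokenizer.from_pretrained(
        local_dir,
        trust_remote_code=True,
        local_files_only=True,  # fail fast instead of hitting the network
    )

# Usage (runs fully offline once the repo is on disk), e.g.:
# tokenizer = load_local_tokenizer("./replit-code-v1-3b")
# ids = tokenizer.encode("def add(a, b): return a + b")
```

The flag is a security gate, not a network setting: you are opting in to running third-party Python code from the downloaded repo, which is why there is no way to skip it for custom tokenizer classes.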