Closed: J-Curwell closed this issue 2 weeks ago
Can confirm - I have the exact same issue. I have tried passing the token directly and that doesn't work.
If I downgrade to 3.2.1, it works like a charm.
Feel free to stick with 3.2.1 for now, unless you'd like to try out some of the new v3.3.0 features like Training with Prompts, NanoBEIR Evaluation, PEFT, Int8 quantization with OpenVINO, etc. In that case, you can install the bleeding-edge version until I bring out v3.3.1:

```
pip install git+https://github.com/UKPLab/sentence-transformers.git
```
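Or, if you'd rather pin to the release that currently works for you in the meantime:

```
pip install sentence-transformers==3.2.1
```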
Thanks for reporting this!
Issue:
We have a fine-tuned version of ember-v1 (https://huggingface.co/llmrails/ember-v1) hosted as a private model in our Hugging Face org. We could successfully load this model with the previous release of sentence-transformers (3.2.1), but as of 3.3.0 the same code no longer works.
Steps to reproduce:
We do not have the HF_TOKEN environment variable set; our architecture reads the token directly from a secret manager rather than setting it as an environment variable. The token value and model name are redacted below because the model is private.

```python
from sentence_transformers.models import Transformer

args = {
    "token": "<hf-token>",  # real value is read from our secret manager, not from HF_TOKEN
    "trust_remote_code": False,
    "revision": None,
    "local_files_only": False,
}

transformer = Transformer(
    model_name_or_path="<our-private-model>",  # redacted: our private fine-tune of ember-v1
    cache_dir=None,
    backend="torch",
    max_seq_length=512,
    do_lower_case=True,
    model_args=args,
    tokenizer_args=args,
    config_args=args,
)
```
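For reference, here is a rough, self-contained sketch of the same kind of private-model load through the top-level SentenceTransformer API; the model id and token value are placeholders rather than our real ones:

```python
from sentence_transformers import SentenceTransformer

hf_token = "<hf-token>"  # placeholder; in our setup this is fetched from the secret manager

model = SentenceTransformer(
    "<our-org>/<our-private-ember-v1-finetune>",  # placeholder for the private repository
    token=hf_token,
    trust_remote_code=False,
    revision=None,
    local_files_only=False,
    backend="torch",
)

# Quick smoke test that the model loaded and can embed text.
print(model.encode(["quick smoke test"]).shape)
```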