Closed: Juhong-Namgung closed this issue 2 months ago.
I have the same problem.
I tried different settings, but none of them work for me.
I also tried setting `TRUST_REMOTE_CODE = 1`, but that doesn't work either.
I stumbled upon the same problem. Looking through the code, it turns out the documentation is incorrect. Set the environment variable to `true` manually and it is passed correctly.
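For reference, here is a minimal sketch of the kind of strict parsing that would explain this behaviour; the `_to_bool` helper below is hypothetical and only illustrates why `1` might be ignored while `true` is accepted, it is not the worker's actual code:

```python
import os


def _to_bool(value: str | None) -> bool:
    """Hypothetical env-var parsing: only the literal string "true"
    (case-insensitive) is treated as truthy, so "1" or "True " with
    stray whitespace would silently disable the flag."""
    if value is None:
        return False
    return value.strip().lower() == "true"


# TRUST_REMOTE_CODE=1   -> False under this parsing
# TRUST_REMOTE_CODE=true -> True
trust_remote_code = _to_bool(os.environ.get("TRUST_REMOTE_CODE"))
print(f"trust_remote_code={trust_remote_code}")
```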
Fixed, closing this.
I tried to run the new version of Worker vLLM:
runpod/worker-v1-vllm:stable-cuda12.1.0
Despite setting `trust_remote_code` to True (1), the setting does not seem to apply, as indicated by the error below. (Note: the Llama 3.1 model runs correctly, but the `trust_remote_code` = True setting is still not applied.)

My Configuration and Error: