Closed yiyan32 closed 3 weeks ago
turn off your vpn
> turn off your vpn

It works, thanks a lot!
If you are behind a proxy, you need to set the HTTP_PROXY, HTTPS_PROXY, and NO_PROXY environment variables.
```python
import os
from urllib.request import getproxies

# Mirror the system proxy settings into the environment variables
# that downstream HTTP libraries honour.
proxies = getproxies()
if "http" in proxies:
    os.environ["http_proxy"] = proxies["http"]
if "https" in proxies:
    os.environ["https_proxy"] = proxies["https"]
# Exempt local addresses so gradio can still reach localhost
os.environ["no_proxy"] = "localhost,127.0.0.1,::1"
```
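If you are unsure the variables took effect, a quick sanity check is to read them back with `getproxies()`. A minimal sketch (the proxy address `http://127.0.0.1:7890` is a hypothetical example, not from the thread):

```python
import os
from urllib.request import getproxies

# Hypothetical proxy address, used only to illustrate the round-trip check
os.environ["http_proxy"] = "http://127.0.0.1:7890"
os.environ["https_proxy"] = "http://127.0.0.1:7890"
os.environ["no_proxy"] = "localhost,127.0.0.1,::1"

# getproxies() should now report the same values back
# (note: the no_proxy entry appears under the key "no")
current = getproxies()
print(current.get("http"))
```

On Windows, `getproxies()` also falls back to the registry, so environment variables you set here take precedence for the current process only.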
```
(Omost_3.10.11) PS C:\Windows\system32> python C:\Colony\Omost\gradio_app.py
C:\Colony\Omost\lib_omost\pipeline.py:64: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
  alphas_cumprod = torch.tensor(np.cumprod(alphas, axis=0), dtype=torch.float32)
Unload to CPU: AutoencoderKL
Unload to CPU: CLIPTextModel
Unload to CPU: CLIPTextModel
Unload to CPU: UNet2DConditionModel
Unused kwargs: ['_load_in_4bit', '_load_in_8bit', 'quant_method']. These kwargs are not used in <class 'transformers.utils.quantization_config.BitsAndBytesConfig'>.
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:02<00:00, 1.10s/it]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
You shouldn't move a model that is dispatched using accelerate hooks.
Unload to CPU: LlamaForCausalLM
Running on local URL: http://0.0.0.0:7860
Traceback (most recent call last):
  File "C:\Colony\Omost\gradio_app.py", line 382, in <module>
    demo.queue().launch(inbrowser=True, server_name='0.0.0.0')
  File "C:\ProgramData\miniconda3\envs\Omost_3.10.11\lib\site-packages\gradio\blocks.py", line 2375, in launch
    raise ValueError(
ValueError: When localhost is not accessible, a shareable link must be created. Please set share=True or check your proxy settings to allow access to localhost.
```
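This error means gradio's startup check could not reach localhost, usually because the proxy intercepts local requests. If you cannot turn the proxy/VPN off, a sketch of the workaround is to exempt local addresses via NO_PROXY before gradio launches (the port 7860 and the env-var route are from the log above; `proxy_bypass_environment` is a stdlib helper used here only to verify the exemption):

```python
import os
from urllib.request import proxy_bypass_environment

# Exempt local addresses from the proxy BEFORE demo.queue().launch(...) runs,
# so gradio's startup check can reach http://localhost:7860.
os.environ["NO_PROXY"] = "localhost,127.0.0.1,::1"

# Confirm that localhost now bypasses the proxy
bypassed = proxy_bypass_environment("localhost")
print(bool(bypassed))
```

Alternatively, as the error message itself suggests, passing `share=True` to `launch()` creates a public shareable link instead of relying on localhost.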