AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI

[Bug]: WARNING: Invalid HTTP request received. #14298

Open Sir20000 opened 11 months ago

Sir20000 commented 11 months ago

Is there an existing issue for this?

What happened?

If I use an Apache reverse proxy, I run into the error "WARNING: Invalid HTTP request received." How do I solve this?

Steps to reproduce the problem

1. Create a reverse proxy with Apache (a minimal example vhost is sketched after this list).
2. Try to connect.
3. Check the log.
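
For reference, here is a minimal sketch of an Apache reverse-proxy vhost that forwards both regular HTTP traffic and the gradio WebSocket upgrades. The hostname is a placeholder, the backend address is the one from the log below (127.0.0.1:7860), and it assumes mod_proxy, mod_proxy_http, mod_proxy_wstunnel and mod_rewrite are enabled:

```apache
<VirtualHost *:80>
    # Placeholder hostname -- replace with your own
    ServerName sd.example.com

    ProxyRequests Off
    ProxyPreserveHost On

    # Forward WebSocket upgrades (used by the gradio queue);
    # requires mod_proxy_wstunnel and mod_rewrite
    RewriteEngine On
    RewriteCond %{HTTP:Upgrade} =websocket [NC]
    RewriteRule ^/?(.*) ws://127.0.0.1:7860/$1 [P,L]

    # Everything else is forwarded as plain HTTP -- the backend does not speak TLS
    ProxyPass        / http://127.0.0.1:7860/
    ProxyPassReverse / http://127.0.0.1:7860/
</VirtualHost>
```

Note that ProxyPass points at http://, not https://: the backend in the log serves plain HTTP, and forwarding TLS to it is one way to produce exactly these "Invalid HTTP request received." warnings.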

What should have happened?

The page should have loaded without problems. Instead I get the error "Connection errored out."

Sysinfo

.

What browsers do you use to access the UI?

No response

Console logs

venv "D:\ai\venv\Scripts\Python.exe"
==============================================================================================================
INCOMPATIBLE PYTHON VERSION

This program is tested with 3.10.6 Python, but you have 3.11.1.
If you encounter an error with "RuntimeError: Couldn't install torch." message,
or any other error regarding unsuccessful package (library) installation,
please downgrade (or upgrade) to the latest version of 3.10 Python
and delete current Python and "venv" folder in WebUI's directory.

You can download 3.10 Python from here: https://www.python.org/downloads/release/python-3106/

Alternatively, use a binary release of WebUI: https://github.com/AUTOMATIC1111/stable-diffusion-webui/releases

Use --skip-python-version-check to suppress this warning.
==============================================================================================================
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
Python 3.11.1 (tags/v3.11.1:a7a450f, Dec  6 2022, 19:58:39) [MSC v.1934 64 bit (AMD64)]
Version: 1.6.1
Commit hash: <none>
Launching Web UI with arguments:
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
Loading weights [6ce0161689] from D:\ai\models\Stable-diffusion\v1-5-pruned-emaonly.safetensors
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Creating model from config: D:\ai\configs\v1-inference.yaml
Startup time: 43.1s (prepare environment: 23.6s, import torch: 7.2s, import gradio: 2.8s, setup paths: 2.5s, initialize shared: 0.5s, other imports: 2.0s, setup codeformer: 0.2s, load scripts: 2.4s, create ui: 0.7s, gradio launch: 1.2s).
Applying attention optimization: Doggettx... done.
Model loaded in 43.0s (load weights from disk: 1.2s, create model: 1.6s, apply weights to model: 38.9s, apply half(): 0.2s, load VAE: 0.2s, calculate empty prompt: 0.6s).
WARNING:  Invalid HTTP request received.
WARNING:  Invalid HTTP request received.
WARNING:  Invalid HTTP request received.
WARNING:  Invalid HTTP request received.
WARNING:  Invalid HTTP request received.
WARNING:  Invalid HTTP request received.

Additional information

No response

rolandwellinger commented 10 months ago

Any solution for this? I have the exact same problem. It is not working from remote access: login works, but Generate does not. What is the reason for this? From another computer on the LAN it works internally, just not from the outside.

I USE NO PROXY!

Orion-zhen commented 9 months ago

Same error with and without a proxy. I have tried unloading all extensions and rebuilding the venv from scratch, but nothing worked.

Once I click the Generate button, it reports:

Applying attention optimization: Doggettx... done.
No Image data blocks found.
Model loaded in 4.3s (load weights from disk: 0.8s, create model: 0.2s, apply weights to model: 2.7s, load textual inversion embeddings: 0.3s, calculate empty prompt: 0.1s).
No Image data blocks found.
No Image data blocks found.
WARNING:  Invalid HTTP request received.

Loverboy3030 commented 3 months ago

Guys, just try to ensure the address starts with http and not https.

scorpiomaj27 commented 2 months ago

> Guys, just try to ensure the address starts with http and not https.

Unfortunately that doesn't work; I've always used http.
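
For what it's worth, the warning itself comes from the uvicorn server behind the webui: it is logged whenever the backend receives bytes it cannot parse as HTTP, and a TLS handshake sent to the plain-HTTP port (i.e. something talking https:// to 127.0.0.1:7860, whether a browser or a proxy's ProxyPass target) is a common way to produce it. Here is a small diagnostic sketch, assuming a locally running webui on the default port, that should reproduce the warning in the console:

```python
# Diagnostic sketch: send a TLS handshake to the webui's plain-HTTP port.
# Assumption: the webui is running locally on the default 127.0.0.1:7860 (as in the log above).
# The backend cannot parse the TLS ClientHello as HTTP, so it logs
# "WARNING:  Invalid HTTP request received." while the client-side handshake fails.
import socket
import ssl

HOST, PORT = "127.0.0.1", 7860  # adjust if you changed the port

context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

raw = socket.create_connection((HOST, PORT), timeout=5)
try:
    # This is effectively what requesting https://127.0.0.1:7860/ does.
    context.wrap_socket(raw, server_hostname=HOST)
except (ssl.SSLError, OSError) as exc:
    print(f"TLS handshake against the plain-HTTP port failed, as expected: {exc}")
finally:
    raw.close()
```

If that matches what you see, the fix is on the client or proxy side: terminate TLS at Apache (or skip it entirely) and make sure the proxy forwards to http://127.0.0.1:7860, not https://.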