osi1880vr / prompt_quill

Apache License 2.0

unclear at best #6

Closed. mr-lab closed this issue 5 months ago.

mr-lab commented 5 months ago

Telemetry is enabled by default, and the installation instructions are unclear at best: llama-cpp_windows, llama_index_pq, llmware_pq ... like, why so many? I'm willing to convert this to an SD WebUI script, but there are so many unnecessary things. The second I saw telemetry working, I stopped the installation. I understand you want to gather more data, but ... do let us know.

osi1880vr commented 5 months ago

What telemetry are you talking about? I don't receive any data, so let me know what you think is enabled and I will try to turn it off.

mr-lab commented 5 months ago

> What telemetry are you talking about? I don't receive any data, so let me know what you think is enabled and I will try to turn it off.

[image attached]

mr-lab commented 5 months ago

I'm going with these lines:

python -m venv venv
venv\Scripts\activate
python -m pip install -r requirements.txt
python prompt_quill_ui_qdrant.py

Would that work as a local Gradio web page? Currently downloading the models.
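For context on the "local Gradio web page" question, here is a minimal sketch of what a Gradio entry script generally looks like; prompt_quill_ui_qdrant.py is assumed to do something equivalent (not verified here). By default, launch() serves the UI locally at http://127.0.0.1:7860.

```python
# Minimal Gradio sketch, not the actual prompt_quill UI code.
import gradio as gr

with gr.Blocks() as demo:
    prompt_in = gr.Textbox(label="Prompt idea")
    prompt_out = gr.Textbox(label="Expanded prompt")
    # Placeholder wiring; the real UI would call the Qdrant + LLM pipeline here.
    prompt_in.submit(lambda text: text, inputs=prompt_in, outputs=prompt_out)

# Serves a local web page, by default at http://127.0.0.1:7860
demo.launch()
```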

osi1880vr commented 5 months ago

That's the Qdrant server that seems to be sending something, if it really does send anything, which I don't know. But good hint, I will try to find out how to disable that.

osi1880vr commented 5 months ago

You would somehow need to run Qdrant, the vector store.
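For anyone following along, a minimal sketch of what "running Qdrant" means in practice, assuming a local server on the default port 6333 (started via the Windows binary or `docker run -p 6333:6333 qdrant/qdrant`); the client call is from the official qdrant-client package:

```python
# Sanity-check sketch: confirm the local Qdrant vector store is reachable.
from qdrant_client import QdrantClient

client = QdrantClient(host="localhost", port=6333)  # default local port
print(client.get_collections())  # lists the collections the UI would query
```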

osi1880vr commented 5 months ago

If you don't mind, you're welcome in my Discord; that makes things much easier to talk about =)

XmYx commented 5 months ago

I'm going to be implementing this for ComfyUI. Please, if possible, let's aim for a shared codebase with minor differences that check whether we are in Comfy, Auto or Forge, but ultimately we should have a model loader, a Qdrant sampler, and an LLM sampler. With enough modularity, the LLM is arbitrary.
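A rough sketch of the "check whether we are in Comfy, Auto or Forge" idea; the import probes are assumptions about how each host exposes its packages, not code from this repo:

```python
# Hypothetical helper: let a shared codebase branch on the host UI it runs in.
def detect_host() -> str:
    try:
        import comfy  # ComfyUI ships a top-level "comfy" package
        return "comfy"
    except ImportError:
        pass
    try:
        import modules.scripts  # Auto1111 and Forge extensions import from "modules"
        return "auto_or_forge"
    except ImportError:
        pass
    return "standalone"  # e.g. the Gradio UI from this repo
```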

mr-lab commented 5 months ago

> That's the Qdrant server that seems to be sending something, if it really does send anything, which I don't know. But good hint, I will try to find out how to disable that.

Qdrant, that's what's making my head spin; it's the first time I've heard about it. I was trying to convert the project to a Python script extension for Forge and Auto1111; we need to find a way to make it work ... You did an awesome job adding Horde and Civitai generation, but there are already scripts to do that in both Auto1111 and ComfyUI. I will try, thanks again.
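As a sketch of the "Python script extension for Forge and Auto1111" direction: the rough shape of an A1111/Forge script, with a hypothetical placeholder where the shared Prompt Quill code (Qdrant retrieval plus LLM rewrite) would plug in. None of this is from the repo.

```python
# Sketch only: a minimal Auto1111/Forge script that could host Prompt Quill.
import gradio as gr
import modules.scripts as scripts
from modules.processing import process_images


def _expand_prompt(prompt: str) -> str:
    # Hypothetical placeholder: the real version would call the shared
    # Prompt Quill pipeline (Qdrant retrieval + LLM) to expand the prompt.
    return prompt


class PromptQuillScript(scripts.Script):
    def title(self):
        return "Prompt Quill (sketch)"

    def ui(self, is_img2img):
        enabled = gr.Checkbox(label="Expand prompt with Prompt Quill", value=False)
        return [enabled]

    def run(self, p, enabled):
        if enabled:
            p.prompt = _expand_prompt(p.prompt)
        return process_images(p)
```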

osi1880vr commented 5 months ago

Setting QDRANT__TELEMETRY_DISABLED=true as an environment variable will disable Qdrant telemetry. I am adding this now to all the scripts so it is set by default: https://qdrant.tech/documentation/guides/telemetry/
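A small sketch of how a launcher could export that variable before starting the local Qdrant server; the executable name and launch method are illustrative, and the repo's install scripts are assumed to do the equivalent:

```python
# Sketch: opt out of Qdrant telemetry before the server process starts.
import os
import subprocess

env = os.environ.copy()
env["QDRANT__TELEMETRY_DISABLED"] = "true"  # documented Qdrant opt-out flag

# "qdrant.exe" is illustrative; point this at however you start the local server.
subprocess.Popen(["qdrant.exe"], env=env)
```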

osi1880vr commented 5 months ago

I have now disabled the telemetry for haystack and llama-index; llmware will follow as soon as I finish the other updates I'm doing there.

osi1880vr commented 5 months ago

Telemetry is taken care of now, so this is closed.