BlafKing / sd-civitai-browser-plus

Extension to access CivitAI via WebUI: download, delete, scan for updates, list installed models, assign tags, and boost downloads with multi-threading.
GNU Affero General Public License v3.0

[Bug]: Every file processed #281

Closed: silveroxides closed 4 months ago

silveroxides commented 4 months ago

Describe the bug.

The updater processes every individual file in a category each time "update previews" is run, which I do not understand. Because it processes each file one by one, the time it takes is way over the top. This relates to issue #265, which was never properly solved. The models in the console logs below are over a month old and have surely been scanned about 20 times over by now.

Steps to reproduce the problem.

  1. Go to Update models
  2. Press any category with many model files
  3. Press update previews
  4. Wait forever while it processes every single .safetensors file one by one

Expected behavior

I would have expected the extension to build an index of the models it has already scanned. I was informed this was resolved in the April 7th commit, but it clearly wasn't.
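For illustration only, the kind of index being asked for could be a single cache file keyed by file hash that is consulted before any per-file API call. This is a sketch of the idea, not the extension's actual code; all names here are made up:

```python
import hashlib
import json
from pathlib import Path


def load_index(cache_path):
    """Load the scan index; an empty dict if no cache exists yet."""
    p = Path(cache_path)
    return json.loads(p.read_text(encoding="utf-8")) if p.exists() else {}


def lookup(index, file_path):
    """Return (digest, cached result). A cached result of None means the
    file has never been scanned and would need one API request."""
    digest = hashlib.sha256(Path(file_path).read_bytes()).hexdigest()
    return digest, index.get(digest)


def record(index, cache_path, digest, result):
    """Persist a scan result (a model ID or "Model not found") so a later
    "update previews" run can skip the API call entirely."""
    index[digest] = result
    Path(cache_path).write_text(json.dumps(index), encoding="utf-8")
```

Keying on the sha256 digest rather than the file name would also mean renamed files don't force a rescan.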

System info

Extension version: 2793bcdf Webui version: v1.8.0   python: 3.10.9   torch: 2.1.1+cu118   xformers: N/A   gradio: 3.41.2  

Console logs

CivitAI Browser+: model: "PonyDiff_experimental.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_02.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_03.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_04.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_05.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_06.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_07.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_08.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_09.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_64dim.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_95TE25U.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_95TE50U.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_99TE25U.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_99TE50U.safetensors" not found on CivitAI servers.
CivitAI Browser+: model: "PonyDiff_experimental_PDEXP.safetensors" not found on CivitAI servers.

Additional information

No response

BlafKing commented 4 months ago

The extension detects whether a model is absent from the CivitAI servers by checking the model ID in the associated .json file.

For example:

{
    "sha256": "Model_hash",
    "modelId": "Model not found"
}

If the modelId is set to "Model not found", the extension skips the model and does not include it in an API request. (By default it also logs the "not found on CivitAI servers" message; this can be toggled in the settings.)
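Based on the .json layout shown above, that skip check can be sketched as follows (the key name comes from the example; the helper name is my own):

```python
import json
import os


def previously_not_found(json_path):
    """Return True if the model's metadata .json marks it as already
    looked up and missing from CivitAI, so no API request is needed."""
    if not os.path.isfile(json_path):
        return False  # never scanned: a fresh lookup would be required
    with open(json_path, "r", encoding="utf-8") as f:
        data = json.load(f)
    return data.get("modelId") == "Model not found"
```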

If a model has not been processed yet, and sha256 generation is turned on, the extension generates a sha256 hash and sends a request to the CivitAI API. When it receives a 404, indicating the model does not exist, the CivitAI API returns "error": "Model not found"; the code grabs that error string and saves it as the model ID, which in the future will be read so the file is no longer processed.

The only change I can think of is to manually set the model ID to "Model not found" instead of reading the returned error, but the current code already stops processing files that were previously not found, as I wrote in the previous bug report.

Additionally, you can check the Model IDs in the models' associated .json files to see whether each file is triggering an individual API request. I suspect this is the issue, as the current code should correctly handle skipping models.
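A quick way to do that audit from a Python shell, assuming the .json files sit next to the model files with the layout shown above (this is my own helper, not part of the extension):

```python
import json
from pathlib import Path


def audit_model_jsons(model_dir):
    """Split a directory's metadata .json files into those marked as not
    found on CivitAI and those with no modelId at all (the latter are the
    ones that will trigger a fresh API request per file)."""
    not_found, missing = [], []
    for meta in Path(model_dir).glob("*.json"):
        try:
            data = json.loads(meta.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            missing.append(meta.name)  # unreadable: treated as unscanned
            continue
        model_id = data.get("modelId")
        if model_id == "Model not found":
            not_found.append(meta.name)
        elif not model_id:
            missing.append(meta.name)
    return not_found, missing
```

If the `missing` list is large, each of those files will get an individual API request on every update run, which would explain the long processing times.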

Here is the relevant code in case you're interested, along with some comments explaining the steps. First, the get_models function is run for every file:

def get_models(file_path, gen_hash=None):
    # First we try to read the .json file and get an existing model ID, if present

    if not modelId or not modelVersionId or not sha256: # If no model ID, version ID, or sha256 is found, try to generate a sha256 hash
        if not sha256 and gen_hash:
            sha256 = gen_sha256(file_path)

        if sha256:
            by_hash = f"https://civitai.com/api/v1/model-versions/by-hash/{sha256}"
        else:
            return modelId if modelId else None # If no sha256 hash could be created, return the current model ID if available, else None.

    proxies, ssl = _api.get_proxies()
    try:
        if not modelId or not modelVersionId:
            response = requests.get(by_hash, timeout=(60,30), proxies=proxies, verify=ssl) # Send an API request with the sha256 hash
            if response.status_code == 200:
                api_response = response.json()
                if 'error' in api_response:
                    print(f"\"{file_path}\": {api_response['error']}")
                    return None
                else:
                    modelId = api_response.get("modelId", "")
                    modelVersionId = api_response.get("id", "")
            elif response.status_code == 503:
                return "offline"
            elif response.status_code == 404: # If no model is returned, aka a 404 error, then set the model to the returned error, being "Model not found"
                api_response = response.json()
                modelId = api_response.get("error", "")

    # Later on, the model ID and other found info are saved to the associated .json file.

model_id = get_models(file_path, gen_hash) # Get the model ID from the get_models function
if model_id == "offline":
    print("The CivitAI servers did not respond, unable to retrieve Model ID")
elif model_id == "Model not found": # If the model's ID == "Model not found", skip the model and do not add it to the list of valid models.
    if not_found_print:
        print(f"model: \"{file_name}\" not found on CivitAI servers.")

As far as I can tell, there is currently no issue with the way the code handles the behavior you're experiencing, so I'll close this bug report as it's not something I can fix. You're free to ask further questions about why this might be happening.