matatonic / openedai-vision

An OpenAI API compatible server for chat with image input and questions about the images, a.k.a. multimodal.
GNU Affero General Public License v3.0

What changed that MiniCPM-V-2.6 is suddenly a gated and restricted repo? #25

Closed saket424 closed 2 weeks ago

saket424 commented 2 weeks ago

I had this downloaded to the cache and working for weeks, and now I get the error below. What do I need to do to fix it? moondream2 works fine and does not appear to be gated.

Cannot access gated repo for url https://huggingface.co/openbmb/MiniCPM-V-2_6/resolve/main/processor_config.json.
Access to model openbmb/MiniCPM-V-2_6 is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 185, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 187, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/openedai.py", line 126, in log_requests
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 163, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 149, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
    response = await f(request)
               ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/vision.py", line 97, in vision_chat_completions
    text = await vision_qna.chat_with_images(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/vision_qna.py", line 136, in chat_with_images
    resp = [r async for r in self.stream_chat_with_images(request)]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/vision_qna.py", line 136, in <listcomp>
    resp = [r async for r in self.stream_chat_with_images(request)]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/minicpm-v-2_6.py", line 82, in stream_chat_with_images
    answer = self.model.chat(
             ^^^^^^^^^^^^^^^^
  File "/app/hf_home/modules/transformers_modules/openbmb/MiniCPM-V-2_6/c13ade7d9008d590524921b46f614b00135d75ee/modeling_minicpmv.py", line 301, in chat
    self.processor = AutoProcessor.from_pretrained(self.config._name_or_path, trust_remote_code=True)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/processing_auto.py", line 322, in from_pretrained
    return processor_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/processing_utils.py", line 916, in from_pretrained
    processor_dict, kwargs = cls.get_processor_dict(pretrained_model_name_or_path, **kwargs)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/processing_utils.py", line 610, in get_processor_dict
    resolved_processor_file = cached_file(
                              ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/hub.py", line 421, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/openbmb/MiniCPM-V-2_6.
401 Client Error. (Request ID: Root=1-673584e2-27729b997a93e904586765ea;81471dcc-c544-422f-b31f-2f584653f9c5)

Cannot access gated repo for url https://huggingface.co/openbmb/MiniCPM-V-2_6/resolve/main/processor_config.json.
Access to model openbmb/MiniCPM-V-2_6 is restricted. You must have access to it and be authenticated to access it. Please log in.
matatonic commented 2 weeks ago

Sadly, yes they did.

If you want to follow the official way, you can create a huggingface account, agree to share your contact info, and request access to the model. You can then get an API token from huggingface and save it in vision.env like so:

HF_TOKEN=hf_XXXXXXXXXXXXXXXXXXXXXXXXXXXX

And it should work again.
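
If you want to sanity-check the token before restarting the container, something like this works (a minimal sketch using huggingface_hub, which transformers already pulls in; the filename is the one from the traceback above):

import os
from huggingface_hub import whoami, hf_hub_download

# HF_TOKEN is the value saved in vision.env
token = os.environ.get("HF_TOKEN")
print(whoami(token=token)["name"])  # prints your huggingface username if the token is valid

# If access to the gated repo was granted, this resolves the exact file
# the traceback above failed on.
print(hf_hub_download("openbmb/MiniCPM-V-2_6", "processor_config.json", token=token))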

Another option is to create a huggingface account and upload your cached copy of the model for yourself without the gating. I'm not sure whether this complies with the license, however, so I can't advise it.

Alternately, there are a few people who have re-uploaded the model without such restrictions, but I can't vouch for those models. If you do plan to use an alternate upload, I'd advise verifying the checksums before use (see the sketch below), and perhaps even cloning the repo to your own account first to guard against future changes.
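
Hugging Face shows a sha256 for each LFS file on its file page in the original repo, so a quick local check can be as simple as this (a minimal sketch; the filename is a placeholder for whichever weight file you have):

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Stream in chunks so multi-GB safetensors files don't need to fit in RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

print(sha256_of("model-00001-of-00004.safetensors"))
# Compare the printed digest to the sha256 listed on the original
# openbmb/MiniCPM-V-2_6 file page before trusting a re-upload.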

I would like future releases to stop relying on huggingface downloads and allow local models, but I haven't found the time to update the codebase in a little while.
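
In the meantime, transformers can already be pointed at the existing cache: setting HF_HUB_OFFLINE=1 makes huggingface_hub resolve everything from the local cache without contacting the hub. This isn't something openedai-vision exposes today, just a sketch of the idea, and it assumes the model files are still cached:

import os
# Must be set before transformers/huggingface_hub are imported.
os.environ["HF_HUB_OFFLINE"] = "1"

from transformers import AutoModel, AutoProcessor

# Resolved entirely from the local cache; no 401, because no request is made.
model_id = "openbmb/MiniCPM-V-2_6"
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)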

saket424 commented 2 weeks ago

@matatonic Thanks for the response. Including the HF_TOKEN and agreeing to the model rules fixed it. Definitely, once the model has been downloaded and exists locally, we should be able to decouple from huggingface when feasible.