winstxnhdw / nllb-api

A performant, high-throughput CPU-based API for Meta's No Language Left Behind (NLLB), using CTranslate2 and hosted on Hugging Face Spaces.
https://huggingface.co/spaces/winstxnhdw/nllb-api

Add docker image with model #43

Closed niksedk closed 1 year ago

niksedk commented 1 year ago

Nice work ❤️

Would it be possible to add the model to the docker image? I've already downloaded the 1.4 GB model three times today...

winstxnhdw commented 1 year ago

Doing that would increase the build times of my CI tremendously. You can cache the model with a persistent volume instead.

docker run --rm -e SERVER_PORT=5000 -e APP_PORT=7860 -p 7860:7860 -v ./cache:/home/user/.cache nllb-api

EDIT:

I never expected anyone to want to self-host this project. I've added a section on self-hosting to the README that explains how to optimise the API for your use case(s).
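For anyone self-hosting long-term, the same cache mount can be expressed as a Compose file. A minimal sketch, assuming the image name, ports, and cache path from the commands elsewhere in this thread:

```yaml
services:
  nllb-api:
    image: ghcr.io/winstxnhdw/nllb-api:main
    environment:
      SERVER_PORT: 5000
      APP_PORT: 7860
    ports:
      - "7860:7860"
    volumes:
      # Persist the Hugging Face model cache across container restarts,
      # so the ~1.4 GB model is only downloaded once
      - ./cache:/home/user/.cache
```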

niksedk commented 1 year ago

Thanks for the info about caching :)

I can no longer run the Docker image:

docker run --rm -e SERVER_PORT=5000 -e APP_PORT=7860 -p 7860:7860 ghcr.io/winstxnhdw/nllb-api:main

2023-09-30 09:34:21,552 INFO supervisord started with pid 1
2023-09-30 09:34:22,555 INFO spawned: 'server' with pid 7
2023-09-30 09:34:22,557 INFO spawned: 'caddy' with pid 8
{"level":"info","ts":1696066462.5948884,"msg":"using provided configuration","config_file":"Caddyfile","config_adapter":"caddyfile"}
{"level":"info","ts":1696066462.5971742,"logger":"admin","msg":"admin endpoint started","address":"localhost:2019","enforce_origin":false,"origins":["//localhost:2019","//[::1]:2019","//127.0.0.1:2019"]}
{"level":"info","ts":1696066462.597516,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0xc000662e80"}
{"level":"info","ts":1696066462.5975373,"logger":"http.log","msg":"server running","name":"srv0","protocols":["h1","h2","h3"]}
{"level":"info","ts":1696066462.597588,"logger":"tls","msg":"cleaning storage unit","description":"FileStorage:/home/user/.local/share/caddy"}
{"level":"info","ts":1696066462.5976112,"logger":"tls","msg":"finished cleaning storage units"}
{"level":"info","ts":1696066462.5978942,"msg":"autosaved config (load with --resume flag)","file":"/home/user/.config/caddy/autosave.json"}
{"level":"info","ts":1696066462.5979033,"msg":"serving initial configuration"}
Traceback (most recent call last):
  File "/home/user/app/main.py", line 15, in <module>
    main()
  File "/home/user/app/main.py", line 12, in main
    run(Config())
        ^^^^^^^^
  File "/home/user/app/server/config/__init__.py", line 17, in __init__
    if (port := get_config('SERVER_PORT', int, InvalidPortError, default_port)) == default_port:
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/app/server/config/get_config.py", line 30, in get_config
    raise exception
server.config.exceptions.invalid_port.InvalidPortError
2023-09-30 09:34:23,348 WARN exited: server (exit status 1; not expected)
2023-09-30 09:34:24,352 INFO spawned: 'server' with pid 21
Traceback (most recent call last):
  File "/home/user/app/main.py", line 15, in <module>
    main()
  File "/home/user/app/main.py", line 12, in main
    run(Config())
        ^^^^^^^^
  File "/home/user/app/server/config/__init__.py", line 17, in __init__
    if (port := get_config('SERVER_PORT', int, InvalidPortError, default_port)) == default_port:
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/app/server/config/get_config.py", line 30, in get_config
    raise exception
server.config.exceptions.invalid_port.InvalidPortError
2023-09-30 09:34:25,140 WARN exited: server (exit status 1; not expected)
2023-09-30 09:34:27,145 INFO spawned: 'server' with pid 22
Traceback (most recent call last):
  File "/home/user/app/main.py", line 15, in <module>
    main()
  File "/home/user/app/main.py", line 12, in main
    run(Config())
        ^^^^^^^^
  File "/home/user/app/server/config/__init__.py", line 17, in __init__
    if (port := get_config('SERVER_PORT', int, InvalidPortError, default_port)) == default_port:
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/app/server/config/get_config.py", line 30, in get_config
    raise exception
server.config.exceptions.invalid_port.InvalidPortError
2023-09-30 09:34:27,973 WARN exited: server (exit status 1; not expected)
2023-09-30 09:34:30,979 INFO spawned: 'server' with pid 23
Traceback (most recent call last):
  File "/home/user/app/main.py", line 15, in <module>
    main()
  File "/home/user/app/main.py", line 12, in main
    run(Config())
        ^^^^^^^^
  File "/home/user/app/server/config/__init__.py", line 17, in __init__
    if (port := get_config('SERVER_PORT', int, InvalidPortError, default_port)) == default_port:
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/app/server/config/get_config.py", line 30, in get_config
    raise exception
server.config.exceptions.invalid_port.InvalidPortError
2023-09-30 09:34:31,797 WARN exited: server (exit status 1; not expected)
winstxnhdw commented 1 year ago

Oops, so sorry about this. My local container had cached an older version of the image, which is why I didn't catch this issue. I have fixed it now; make sure you delete the existing image so you can pull the new one.
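For context, the traceback above comes from the config-validation step: `get_config('SERVER_PORT', int, InvalidPortError, default_port)` raised `InvalidPortError` even though the run command at the top of the thread simply omitted `-e SERVER_PORT`. A simplified sketch of the intended behaviour — falling back to a default when the variable is unset and raising only on an unparseable value. The names are taken from the traceback, but the implementation details are assumptions, not the project's actual code:

```python
import os


class InvalidPortError(Exception):
    """Raised when an environment variable cannot be parsed as a valid value."""


def get_config(name, converter, exception, default):
    # Read the environment variable; an unset variable falls back to the
    # default instead of raising, so `docker run` works without -e SERVER_PORT.
    value = os.environ.get(name)

    if value is None:
        return default

    try:
        return converter(value)

    except ValueError:
        # Only an unparseable value (e.g. SERVER_PORT=abc) is an error
        raise exception from None


if __name__ == '__main__':
    port = get_config('SERVER_PORT', int, InvalidPortError, 5000)
    print(f'Using port {port}')
```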

niksedk commented 1 year ago

Thanks! It's working fine now with the cached model :)