mvsoom opened this issue 1 year ago
When I add the open_gpt/resources/flow.yml.jinja2 file manually (copying it from this repo), the command starts successfully, but then fails with this error:
```
(.venv) marnix@hp:~/ART/CAMERA/code$ opengpt deploy stabilityai/stablelm-tuned-alpha-3b --precision fp16 --device_map balanced --cloud jina --replicas 1
[08/06/23 21:41:16] INFO Successfully validated flow config. Proceeding to flow deployment...       flow.py:158
[08/06/23 21:41:17] INFO Successfully submitted flow with ID striking-elk-0cbc352599                flow.py:177
                    INFO Check the Flow deployment logs:                                            flow.py:720
                         https://cloud.jina.ai/user/flows?action=detail&id=striking-elk-0cbc352599&tab=logs !
```

**Unexpected phase: Failed reached at Starting for Flow ID striking-elk-0cbc352599**
Any idea how to solve this? I can see the Failed Flow in https://cloud.jina.ai/user/flows, but the Logs tab is empty.
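Is there some other way to pull the deployment logs, e.g. from the jcloud CLI? I'm imagining something like the sketch below, assuming `jc status` is the right subcommand in jcloud 0.2.x (check `jc --help` if the name differs):

```bash
# Hypothetical: query the failed Flow directly from the jcloud CLI.
# The subcommand name is an assumption -- verify with `jc --help` on your version.
jc status striking-elk-0cbc352599
```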
I tried every variation I could think of, including different resource tiers (G2, G3), different models, and different quantization settings.
open_gpt.__version__ == '0.0.13' (+ manually copied open_gpt/resources/flow.yml.jinja2 file from this repo)
jina.__version__ == '3.20.0'
jcloud.__version__ == '0.2.16'
Python 3.9.5
Ubuntu 20.04.6 LTS
I'm currently on the free plan.
I can confirm the same error (Unexpected phase: Failed reached at Starting for Flow ID) happens after doing a fresh install with:

```bash
git clone https://github.com/jina-ai/opengpt.git
cd opengpt
pip install -e .
```
Thanks for the feedback. How long after you executed the `opengpt serve` command did you get this error?
About 10 minutes for every combination that I tried.
This might be an issue in JCloud, since it takes too much time to convert the JCloud YAML file to Kubernetes (k8s) YAML. Unfortunately, there isn't much we can do at this time, but I will keep you updated once we have some progress here. Sorry for the inconvenience. Would it be possible to deploy the model on your local machine?
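Locally, it would be roughly the same command without the cloud-specific flags; a sketch, assuming you have a local GPU with enough memory for the fp16 weights:

```bash
# Serve the model on the local machine instead of deploying to JCloud.
# Assumes a local GPU with enough memory for the fp16 weights of the 3B model.
opengpt serve stabilityai/stablelm-tuned-alpha-3b --precision fp16 --device_map balanced
```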
Unfortunately not. Is it possible to upload it to JCloud and run it without Docker, i.e. in source-code form?
Bump
I installed open_gpt_torch today with `pip install open_gpt_torch`. The `opengpt serve stabilityai/stablelm-tuned-alpha-3b --precision fp16 --device_map balanced` command fails because `open_gpt/resources/flow.yml.jinja2` cannot be found. I think it might be missing from the pip install because of a missing `MANIFEST.in` file?
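In the meantime, the manual workaround described at the top of this thread (copying the template from the repo into the installed package) would look something like the sketch below; the clone location is an assumption, so adjust the paths to your setup:

```bash
# Locate the installed open_gpt package and check whether the template is present.
PKG_DIR=$(python -c "import open_gpt, os; print(os.path.dirname(open_gpt.__file__))")
ls "$PKG_DIR/resources/flow.yml.jinja2" || echo "flow.yml.jinja2 is missing"

# Copy the template from a clone of the repo into the installed package.
# The ~/opengpt path is an assumption -- point it at wherever you cloned jina-ai/opengpt.
mkdir -p "$PKG_DIR/resources"
cp ~/opengpt/open_gpt/resources/flow.yml.jinja2 "$PKG_DIR/resources/"
```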