rh-aiservices-bu / llm-on-openshift

Resources, demos, recipes,... to work with LLMs on OpenShift with OpenShift AI or Open Data Hub.
Apache License 2.0

Minor milvus fixes #45

Closed anishasthana closed 4 months ago

anishasthana commented 4 months ago

cc @guimou

Some things I didn't include in the PR that I needed to do (but wasn't sure if they would fit):

1) I updated `configmap-collections.yaml` to have just one element, with a name matching the collection created by the milvus-ingest notebook.
2) Related: the variable on line 39 is hardcoded. It would probably be better to parametrize it from the deployment YAML itself (see the sketch after this list). While hacking around, I just changed it to `demo_collection` (as per the notebook).
3) An image needs to be pushed to the official quay repo.
4) I didn't realise that the inference server URL (line 55 in `deployment.yaml`) needs to include the protocol as well as the port. Not a big deal, but probably worth calling out in the docs.
5) I didn't get it to work with https/ssl. Admittedly, I didn't spend a lot of time on it.
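For points 2 and 4, here's a minimal sketch of how the app could read both values from the environment (set in `deployment.yaml`) instead of hardcoding them. The variable names (`MILVUS_COLLECTION`, `INFERENCE_SERVER_URL`, etc.) and the defaults are my assumptions for illustration, not the actual keys used in this repo:

```python
import os
from urllib.parse import urlparse

from pymilvus import Collection, connections

# Hypothetical environment variable names -- the real keys in deployment.yaml may differ.
MILVUS_HOST = os.getenv("MILVUS_HOST", "milvus-service")
MILVUS_PORT = os.getenv("MILVUS_PORT", "19530")

# Read the collection name from the environment so it can be set in deployment.yaml
# to match configmap-collections.yaml and the milvus-ingest notebook, instead of
# hardcoding "demo_collection" in the code.
MILVUS_COLLECTION = os.getenv("MILVUS_COLLECTION", "demo_collection")

# The inference server URL must be a full URL including the protocol and port,
# e.g. "http://my-inference-server:8080", not just "my-inference-server:8080".
INFERENCE_SERVER_URL = os.getenv("INFERENCE_SERVER_URL", "http://my-inference-server:8080")

parsed = urlparse(INFERENCE_SERVER_URL)
if parsed.scheme not in ("http", "https"):
    raise ValueError("INFERENCE_SERVER_URL must start with http:// or https://")

# Connect to Milvus and load the configured collection.
connections.connect(host=MILVUS_HOST, port=MILVUS_PORT)
collection = Collection(MILVUS_COLLECTION)
collection.load()
print(f"Using collection '{MILVUS_COLLECTION}' ({collection.num_entities} entities)")
print(f"Inference server: {INFERENCE_SERVER_URL}")
```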

Overall, thank you very much for this! I've learned a lot :-)

guimou commented 4 months ago

All the corrections were already being worked on in another branch and have now been applied.