Open · asinger-narrativelinks opened this issue 1 year ago
I'm seeing the same behavior as above.
It's Dec 20th, 2023, and I'm still seeing this behavior.
I spent hours and many kWh training on local docs. I was shocked to find that the API does not reference them at all. This makes no sense. Why the disconnect between the local Win 11 desktop GUI and the API? Trained on Mistral OpenOrca. Very happy with the GUI results and the 'Show references' option; exactly what I need.
Please raise an issue for this. I don't see it as an 'enhancement' at all, but something that is broken and needs fixing.
Thanks.
Seems like the same issue as #1745.
Hopefully we can get this feature soon
I'm also facing the same issue and have the same need. I'd really appreciate this feature.
Same thing for me. I was happy when I discovered this feature, but unfortunately it doesn't actually exist... so sad.
I'm not a programmer and am not sure how GitHub issues work... Do the devs for this project even know about this issue? How do we raise it, i.e., help them see it? The local desktop app with local docs works great, except for this one issue: the local docs are not used when going through the server/API.
This is an amazing project that does everything we need... It's just so frustrating that the trained local docs are trapped in the desktop GUI and can't be used or accessed by others.
Do the devs for this project even know about this issue?
Yes. But Nomic has other priorities right now. The community is free to contribute fixes in the form of PRs.
Is there any update on the expected timeline for this request?
Depends on what you're trying to do.
You cannot enable LocalDocs or select a collection from a client through the API.
You can, however, select a LocalDocs collection in the chat application itself for subsequent requests. The wiki now has a description of how to enable a collection.
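For reference, a minimal sketch of such a subsequent request, assuming the API server is enabled in Settings on its default address (http://localhost:4891/v1), the legacy openai Python package (< 1.0) that this thread refers to, and a model name that matches whatever your server actually reports:

```python
# Sketch only: assumes the GPT4All chat application is running, the API server is
# enabled in Settings (default http://localhost:4891/v1), and a LocalDocs
# collection has been ticked in the open chat window.
import openai  # openai < 1.0, matching openai.Completion.create used in this thread

openai.api_base = "http://localhost:4891/v1"
openai.api_key = "not-needed-for-local-server"

response = openai.Completion.create(
    model="Mistral OpenOrca",  # example name; use the model your server reports
    prompt="What do my local documents say about onboarding?",
    max_tokens=200,
    temperature=0.28,
)
print(response["choices"][0]["text"])
```

Note that nothing in the request names a collection; whether LocalDocs is used depends on what is enabled in the open chat window at the time of the call.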
I would like to be able to leverage LocalDocs via the API (since I really like the value it adds to the chat application), either by specifying the collection to be used in each API call, or by somehow specifying via the chat application that all API calls should use specific LocalDocs collection(s).
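To make the first option concrete, here is a purely hypothetical sketch; the `localdocs_collections` field below does not exist in the current server and is only meant to illustrate the kind of per-request parameter I have in mind (the endpoint path, model name, and collection name are assumptions about a default local setup):

```python
# Hypothetical illustration of the feature request above; the
# "localdocs_collections" field is NOT supported by the current GPT4All server.
import requests

payload = {
    "model": "Mistral OpenOrca",           # example model name
    "prompt": "Summarise the onboarding guide from my local docs.",
    "max_tokens": 200,
    "localdocs_collections": ["Company Handbook"],  # hypothetical parameter
}
resp = requests.post("http://localhost:4891/v1/completions", json=payload)
print(resp.json()["choices"][0]["text"])
```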
... or by somehow specifying via the chat application that all API calls should use specific LocalDocs collection(s).
That's what I've described above, just follow the link. As long as the application is open, collections should stay active when using the API. Or do you mean as a separate setting somewhere?
Since we are talking about server usage, a more persistent setting for the server chat would be needed, as the current option requires user interaction in the chat application after each restart. Could it be part of Settings, where, similar to enabling the local server, we could also check something like 'persist server chat LocalDocs'?
System Info
Amazing project - thank you!
When I connect to the application through the API by calling "openai.Completion.create", the answer is not based on the local docs that were uploaded to the application. When I ask the same question in the application UI, I receive a different answer, based on the local docs and with references to the context docs. Is there a way to get the same behaviour through the API?
Information
Related Components
Reproduction
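(Steps reconstructed from the description above; the exact model and port depend on the local setup.)
1. Add documents to a LocalDocs collection and enable the API server in Settings.
2. Ask a question about those documents in the chat UI: the answer uses the collection and shows references.
3. Send the same question to the local server via "openai.Completion.create", as described above: the answer does not use the collection.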
Expected behavior
The API should return the same LocalDocs-based answers (and references) as the UI.