⚠️ Please check that this feature request hasn't been suggested before.
[X] I searched previous issues and didn't find any similar feature requests.
Feature description
OpenAI's ChatGPT is a fantastic tool for this application, but is there any development toward supporting a locally hosted language model like Dolly or Vicuna?
Solution
I'd have to look through the code more, but a locally hosted server could act as an API to ease swapping out the ChatGPT API. If the code uses OOP, maybe another module for making requests and another class for local models would be enough; alternatively, it could be left to the user to set up a server that returns the template responses loopGPT requires.
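To make the idea concrete, here is a minimal sketch of the kind of abstraction I have in mind. This is hypothetical code, not loopGPT's actual class layout: the `BaseModel` interface, the class names, and the local server URL are all assumptions for illustration. The agent loop would only depend on the interface, so backends become interchangeable.

```python
# Hypothetical sketch; none of these classes exist in loopGPT today.
from abc import ABC, abstractmethod


class BaseModel(ABC):
    """Common chat interface so the agent loop doesn't care which backend answers."""

    @abstractmethod
    def chat(self, messages):
        """Take a list of {'role': ..., 'content': ...} dicts, return a reply dict."""


class OpenAIModel(BaseModel):
    """Backend for the hosted ChatGPT API (stubbed out here)."""

    def __init__(self, api_key, model="gpt-3.5-turbo"):
        self.api_key = api_key
        self.model = model

    def chat(self, messages):
        # Would POST to the OpenAI chat completions endpoint here.
        raise NotImplementedError("requires network access and an API key")


class LocalServerModel(BaseModel):
    """Backend for a locally hosted model exposed over an OpenAI-style HTTP API.

    The URL is an assumption; it would point at whatever server wraps
    Dolly, Vicuna, etc.
    """

    def __init__(self, base_url="http://localhost:8000/v1"):
        self.base_url = base_url

    def chat(self, messages):
        # Would POST messages to f"{self.base_url}/chat/completions".
        raise NotImplementedError("requires a running local server")


class EchoModel(BaseModel):
    """Offline stand-in used here only to show the interface is swappable."""

    def chat(self, messages):
        return {"role": "assistant", "content": messages[-1]["content"]}


def run_agent_step(model: BaseModel, prompt: str) -> str:
    """The agent loop talks to BaseModel, never to a specific backend."""
    reply = model.chat([{"role": "user", "content": prompt}])
    return reply["content"]
```

With this shape, switching from ChatGPT to a local model is just constructing a different `BaseModel` subclass, which would also address the data-sensitivity concern below.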
Alternatives
No response
Additional Context
Generally, I'd like to use this to aid research for commercial projects that are data-sensitive.
Acknowledgements
[X] My issue title is concise, descriptive, and in title casing.
[X] I have searched the existing issues to make sure this feature has not been requested yet.
[X] I have provided enough information for the maintainers to understand and evaluate this request.