# OPEA Inference Microservices Integration for LangChain
This RFC proposes the integration of OPEA inference microservices (from GenAIComps) into LangChain [extensible to other frameworks], enabli…
-
An example can be found here: https://github.com/opea-project/GenAIExamples/tree/main/Translation
Documentation should follow this format: https://github.com/opea-project/docs/tree/main/examples/ChatQnA
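To make the proposal concrete, below is a minimal sketch of how an OPEA inference microservice could be wrapped as a LangChain LLM. The endpoint path, port, and payload/response shape are assumptions based on a typical OpenAI-compatible GenAIComps deployment, not the final API of this integration.

```python
# Minimal sketch of a LangChain LLM wrapper around an OPEA inference
# microservice. Endpoint, port, and payload keys are assumptions drawn
# from a typical GenAIComps LLM deployment; adjust to the actual service.
from typing import Any, List, Optional

import requests
from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM


class OPEALLM(LLM):
    """Calls an OPEA text-generation microservice over HTTP."""

    endpoint: str = "http://localhost:9000/v1/chat/completions"  # assumed default
    timeout: int = 60

    @property
    def _llm_type(self) -> str:
        return "opea-inference-microservice"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # Payload and response shape are assumptions; OPEA LLM services are
        # typically OpenAI-compatible, but verify against the deployed component.
        payload = {"messages": [{"role": "user", "content": prompt}]}
        resp = requests.post(self.endpoint, json=payload, timeout=self.timeout)
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    llm = OPEALLM()
    print(llm.invoke("Translate 'hello world' into German."))
```

A production integration would also provide streaming and async variants, but a wrapper along these lines is enough to exercise an OPEA endpoint from a LangChain chain.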
-
### Feature Description
OPEA (Open Platform for Enterprise AI) is a project newly introduced by the Linux Foundation; you can find all the details at https://opea.dev/. It provides an open source pl…
-
Some of the docs contain outdated, dangling links; these should be cleaned up.
-
https://opea-project.github.io/latest/getting-started/README.html
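As a starting point for the cleanup, here is a rough sketch of a link checker that could be run over a local checkout of the docs repository. The Markdown file glob and the HEAD-request heuristic are assumptions; anchors and relative links are not validated.

```python
# Rough sketch of a dangling-link checker for a local docs checkout.
import re
import sys
from pathlib import Path

import requests

LINK_RE = re.compile(r"https?://[^\s)>\"']+")


def check_links(docs_root: str) -> int:
    broken = 0
    for path in Path(docs_root).rglob("*.md"):
        for url in set(LINK_RE.findall(path.read_text(errors="ignore"))):
            try:
                # HEAD keeps the check cheap; some servers may reject it.
                status = requests.head(url, allow_redirects=True, timeout=10).status_code
            except requests.RequestException:
                status = -1
            if status == -1 or status >= 400:
                broken += 1
                print(f"{path}: {url} -> {status}")
    return broken


if __name__ == "__main__":
    sys.exit(1 if check_links(sys.argv[1] if len(sys.argv) > 1 else ".") else 0)
```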
-
### Priority
P3-Medium
### OS type
Ubuntu
### Hardware type
AI-PC
### Running nodes
Single Node
### Description
As an AI PC or OPEA developer, I want to deplo…
-
### Priority
P4-Low
### OS type
Ubuntu
### Hardware type
Xeon-GNR
### Running nodes
Single Node
### Description
Examples can involve many Docker images that take up disk spa…
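For reference, a small helper along these lines could report and reclaim Docker disk usage on a test node. It only wraps the stock `docker system df` and `docker image prune` commands, and the prune step is deliberately opt-in because it deletes data.

```python
# Sketch: report Docker disk usage and optionally prune dangling images.
import subprocess
import sys


def report_docker_disk_usage() -> None:
    # Summarize how much space images, containers, and volumes consume.
    subprocess.run(["docker", "system", "df"], check=True)


def prune_dangling_images() -> None:
    # Remove only dangling (untagged) images; add "-a" to remove all unused ones.
    subprocess.run(["docker", "image", "prune", "-f"], check=True)


if __name__ == "__main__":
    report_docker_disk_usage()
    if "--prune" in sys.argv:
        prune_dangling_images()
```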
-
Looking for translations into Chinese, Korean, Japanese, Spanish, and German.
Documentation in English can be found here: https://github.com/opea-project/docs/tree/main/examples/ChatQnA
When the draft is complete, please p…
-
OPEA supports more and more components (https://github.com/opea-project/GenAIComps); the Helm charts should be updated accordingly to provide more building blocks for AI applications.
Helm Charts can…
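One way to track chart coverage could be a small script that compares the component list in GenAIComps against the available charts. The repository and directory names below (`comps/` in GenAIComps, `helm-charts/` in GenAIInfra) are assumptions about the current layout and may need adjusting.

```python
# Rough sketch: compare GenAIComps component directories with chart directories.
import requests

API = "https://api.github.com/repos/opea-project/{repo}/contents/{path}"


def list_dirs(repo: str, path: str) -> set[str]:
    # The GitHub contents API returns one JSON object per entry in the path.
    items = requests.get(API.format(repo=repo, path=path), timeout=30).json()
    return {item["name"].lower() for item in items if item.get("type") == "dir"}


if __name__ == "__main__":
    components = list_dirs("GenAIComps", "comps")        # assumed layout
    charts = list_dirs("GenAIInfra", "helm-charts")      # assumed chart location
    missing = sorted(components - charts)
    print("Components without an obviously matching chart:", missing)
```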