lgtm
@IRONICBo
Front-end, back-end and AI work needs to be done in the new repository.
Nice design. I wanted to ask you: What is the idea behind the repository design? How many repositories do we need?
I think the different parts can live in a single project and be enabled on demand at deployment time. For example, if you don't need local AI deployment, you can simply leave that image unstarted.
-- openim-custom-service
   | frontend
   | backend
   | bot
   | xxx
[RFC OpenIMSDK/Open-IM-Server#2] OpenIM Custom Service Proposal
Meta
📇Topics
Summary
Definitions
Motivation
What it is
How it Works
Migration
Drawbacks
Alternatives
Prior Art
Unresolved Questions
Summary
An open source online customer service system based on OpenIM: an online customer service system is one of the key channels through which enterprises provide service and support to their customers. This project intends to develop such a system on top of the open source OpenIM instant messaging server and to introduce a large language model to build a local knowledge-base Q&A bot. The system will provide online customer service, follow-up callbacks, customer management, and related functions to help enterprises improve the quality and efficiency of their customer service.
Definitions
Motivation
What it is
This section provides a high-level overview of the feature.
How it Works
The project includes login, user management, session management, platform access management, and local knowledge base management.
Login module
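The proposal does not yet spell out the login flow. As a purely illustrative sketch (the user store, secret, and role claim below are hypothetical, and a real implementation would presumably reuse OpenIM's own token and authentication mechanism), a role-carrying login token could be issued roughly like this:

```python
# Hypothetical sketch of a token-based login for the customer service console.
# The user store, SECRET_KEY, and role claim are illustrative only; a real
# implementation would likely delegate authentication to OpenIM itself.
import datetime
import jwt  # PyJWT

SECRET_KEY = "change-me"

# Hypothetical in-memory user store: username -> (password, role)
USERS = {
    "admin": ("admin-pass", "administrator"),
    "agent01": ("agent-pass", "customer_service"),
}

def login(username: str, password: str) -> str:
    """Validate credentials and return a signed token carrying the user's role."""
    record = USERS.get(username)
    if record is None or record[0] != password:
        raise PermissionError("invalid username or password")
    payload = {
        "sub": username,
        "role": record[1],
        "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=8),
    }
    return jwt.encode(payload, SECRET_KEY, algorithm="HS256")
```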
User module
User Roles
Administrator role
Customer service role
User role
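The role split above (administrator, customer service agent, ordinary user) could be modelled as a simple enum plus a permission check. The sketch below is illustrative only; the concrete permission names are assumptions, not part of this proposal:

```python
# Illustrative sketch of the three user roles and a simple permission check.
# The permission names are assumptions for demonstration; the real set would
# be defined when the user module is implemented.
from enum import Enum

class Role(Enum):
    ADMINISTRATOR = "administrator"        # manages agents, platforms, knowledge base
    CUSTOMER_SERVICE = "customer_service"  # handles customer sessions
    USER = "user"                          # end customer asking for support

# Hypothetical role -> permission mapping
PERMISSIONS = {
    Role.ADMINISTRATOR: {"manage_agents", "manage_platforms", "manage_kb", "view_sessions"},
    Role.CUSTOMER_SERVICE: {"handle_sessions", "view_sessions"},
    Role.USER: {"start_session"},
}

def can(role: Role, permission: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return permission in PERMISSIONS.get(role, set())
```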
Sessions module
Historical session lookup
Active session query
Session Manager
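As a rough sketch of how the session manager could expose the two queries named above (historical session lookup and active session query), the in-memory structure below is an assumption used purely for illustration; a real deployment would query OpenIM or the backend database instead:

```python
# Illustrative in-memory session manager covering historical session lookup
# and active session queries. Field names are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional
import time

@dataclass
class Session:
    session_id: str
    customer_id: str
    agent_id: Optional[str] = None
    closed: bool = False
    created_at: float = field(default_factory=time.time)

class SessionManager:
    def __init__(self) -> None:
        self._sessions: dict[str, Session] = {}

    def open_session(self, session_id: str, customer_id: str) -> Session:
        session = Session(session_id=session_id, customer_id=customer_id)
        self._sessions[session_id] = session
        return session

    def close_session(self, session_id: str) -> None:
        self._sessions[session_id].closed = True

    def active_sessions(self) -> List[Session]:
        """Active session query: sessions that have not been closed yet."""
        return [s for s in self._sessions.values() if not s.closed]

    def history_for_customer(self, customer_id: str) -> List[Session]:
        """Historical session lookup: all sessions, open or closed, for one customer."""
        return [s for s in self._sessions.values() if s.customer_id == customer_id]
```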
Platform access module
H5 (mobile web)
Slack
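The platform access module presumably normalizes messages arriving from different channels (the H5 web widget, Slack, and possibly others later) into one internal format before handing them to the session layer. The adapter interface below is a hypothetical sketch of that idea, not an actual API of this project or of Slack's SDK:

```python
# Hypothetical adapter interface for the platform access module. Each supported
# channel (H5 web widget, Slack, ...) converts its own payload into a common
# InboundMessage before it reaches the session layer.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class InboundMessage:
    platform: str          # e.g. "h5" or "slack"
    external_user_id: str
    text: str

class PlatformAdapter(ABC):
    @abstractmethod
    def parse(self, raw_payload: dict) -> InboundMessage:
        """Convert a platform-specific payload into the common message format."""

class H5Adapter(PlatformAdapter):
    def parse(self, raw_payload: dict) -> InboundMessage:
        # Assumed shape of the web-widget payload; field names are illustrative.
        return InboundMessage("h5", raw_payload["visitor_id"], raw_payload["text"])

class SlackAdapter(PlatformAdapter):
    def parse(self, raw_payload: dict) -> InboundMessage:
        # Assumed shape of a Slack event callback; field names follow Slack's
        # message event payloads but are not guaranteed here.
        event = raw_payload["event"]
        return InboundMessage("slack", event["user"], event["text"])
```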
Local knowledge base module
LangChain & ChatGLM model deployment
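As a very rough sketch of the local knowledge base flow (split documents, embed them, retrieve the most relevant chunks, and feed them to ChatGLM as context), assuming a classic LangChain release where these imports are still available and a locally deployed ChatGLM-6B loaded via transformers; the embedding model, checkpoint, file paths, and prompt wording are all assumptions:

```python
# Rough sketch of a local knowledge-base Q&A flow with LangChain + ChatGLM.
# Assumes a classic LangChain release (pre-split packages) and a local GPU;
# the embedding model, ChatGLM checkpoint, and prompt wording are assumptions.
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from transformers import AutoModel, AutoTokenizer

# 1. Build the vector index from local documents.
docs = TextLoader("kb/faq.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
index = FAISS.from_documents(chunks, embeddings)

# 2. Load ChatGLM locally (standard ChatGLM-6B usage via trust_remote_code).
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

def answer(question: str) -> str:
    """Retrieve relevant chunks and let ChatGLM answer with them as context."""
    context = "\n".join(d.page_content for d in index.similarity_search(question, k=3))
    prompt = (
        "Answer the question using the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response, _history = model.chat(tokenizer, prompt, history=[])
    return response
```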
Migration
Front-end, back-end and AI work needs to be done in the new repository.
Drawbacks
Running a large language model against a local knowledge base requires a certain amount of GPU memory and compute, which increases the deployment footprint.
Alternatives
If you cannot run the local AI model, this feature can be disabled, or an external provider such as ChatGPT or Claude can be used instead.
Prior Art
There is existing work by others that can serve as a reference for the implementation.
Unresolved Questions