andrewnguonly / Lumos
A RAG LLM co-pilot for browsing the web, powered by local LLMs
MIT License · 1.39k stars · 103 forks
Issues
#44 · Create docs page for Firefox install · andrewnguonly · closed 9 months ago · 0 comments
#43 · Create LICENSE · andrewnguonly · closed 9 months ago · 0 comments
#42 · Refactor RAG workflow · andrewnguonly · closed 9 months ago · 0 comments
#41 · Bring your own API key · andrewnguonly · closed 5 months ago · 0 comments
#40 · Add functionality to parse highlighted text and pass it to prompt · andrewnguonly · closed 9 months ago · 0 comments
#39 · Add LICENSE · ahmetson · closed 9 months ago · 0 comments
#38 · feat: Lumos -> lmohc - close enough ? static id for secure origin · sublimator · opened 9 months ago · 4 comments
#37 · Create documentation/tutorial showing how to inspect webpage to select content for parsing · andrewnguonly · closed 9 months ago · 0 comments
#36 · Document steps to install on Firefox · andrewnguonly · closed 9 months ago · 2 comments
#35 · Investigate possible functionality for exporting LLM results, saving results, or searching historical results · andrewnguonly · closed 8 months ago · 1 comment
#34 · Create embedded options page · andrewnguonly · closed 9 months ago · 0 comments
#33 · Focus text field after token streaming is complete · andrewnguonly · closed 9 months ago · 0 comments
#32 · Remove web-llm config from webpack config · andrewnguonly · closed 9 months ago · 0 comments
#31 · Inconsistent spacing between avatar and message bubble · andrewnguonly · closed 9 months ago · 0 comments
#30 · Focus on text input after token streaming is done · andrewnguonly · closed 9 months ago · 0 comments
#29 · Remove `@mlc-ai/web-llm` webpack configuration · andrewnguonly · closed 9 months ago · 0 comments
#28 · Implement chat UI · andrewnguonly · closed 9 months ago · 0 comments
#27 · Support multi-modal LLM and media input · andrewnguonly · closed 9 months ago · 0 comments
#26 · Implement UI to persist chat history · andrewnguonly · closed 9 months ago · 1 comment
#25 · Implement response streaming · andrewnguonly · closed 9 months ago · 0 comments
#24 · Move content config to user configuration, create UI · andrewnguonly · closed 9 months ago · 1 comment
#23 · Move `react-scripts` to `devDependencies` · andrewnguonly · closed 9 months ago · 0 comments
#22 · Implement token streaming for responses · andrewnguonly · closed 9 months ago · 0 comments
#21 · Implement custom chunking configs · andrewnguonly · closed 10 months ago · 0 comments
#20 · Update README · andrewnguonly · closed 10 months ago · 0 comments
#19 · Update README with instructions for configuring custom parsing · andrewnguonly · closed 10 months ago · 0 comments
#18 · Support URL patterns for custom content config · andrewnguonly · closed 8 months ago · 0 comments
#17 · Implement custom content chunking for domains · andrewnguonly · closed 10 months ago · 0 comments
#16 · Implement custom content parsing for domains · andrewnguonly · closed 10 months ago · 0 comments
#15 · Add remote parsing code · andrewnguonly · closed 10 months ago · 0 comments
#14 · Update README with instructions to run Ollama API from Docker container · andrewnguonly · closed 10 months ago · 0 comments
#13 · Ollama started service successfully but browser did not respond · su-zelong · closed 10 months ago · 6 comments
#12 · Update `README` to include instructions for pulling `llama2` model and changing models · andrewnguonly · closed 11 months ago · 0 comments
#11 · Getting HTTP 400 errors on /api/embeddings · vicendominguez · closed 11 months ago · 7 comments
#10 · implementing pieces ts sdk for local llms · shivscaler · closed 9 months ago · 12 comments
#9 · Record video/gif of Chrome extension · andrewnguonly · closed 11 months ago · 0 comments
#8 · Save text field state after closing the extension popup · andrewnguonly · closed 11 months ago · 0 comments
#7 · Increase max tokens size for Web LLM · andrewnguonly · closed 11 months ago · 0 comments
#6 · Fix bug with parsing HTML and CSS content · andrewnguonly · closed 10 months ago · 1 comment
#5 · Update README with Use Cases section · andrewnguonly · closed 11 months ago · 0 comments
#4 · Model Web LLM ChatRestModule as LangChain SimpleChatModel · andrewnguonly · closed 11 months ago · 0 comments
#3 · Update loading bar color and text field outline color to yellow · andrewnguonly · closed 11 months ago · 0 comments
#2 · Add listener for "Enter" key to submit prompt · andrewnguonly · closed 11 months ago · 0 comments
#1 · Automatically scroll to bottom of text field as completion is updated · andrewnguonly · closed 11 months ago · 0 comments