Closed · innovateworld closed this 8 months ago
Website scraping doesn't work via Docker:

    Primary server listening on port 3001
    -- Working URL https://baijiahao.baidu.com/s?id=1786347651675278442&wfr=spider&for=pc
    -- getPageContent failed! Error: Could not find Chrome (ver. 119.0.6045.105). This can occur if either
    ... (`npx puppeteer browsers install chrome`) or ...

Closing this as stale; almost all of the items listed are either now their own issues, have been resolved, or are no longer relevant with recent changes 👍
This might need to be split into multiple separate Issues. If so please let me know.
Issues Relating to My Goals (and possibly other people's) with AnythingLLM
There might not be a workaround for these, but I have two primary goals with AnythingLLM that I'm currently unable to use it for, and I also want to mention some separate concerns.
Goal 1: Use AnythingLLM to assist in generating code with the latest frameworks, beyond LLM cutoff dates. Here are my problems:
A) Importing Certain Repositories from GitHub
B1) Non-Supported File Types
Some `.md` files were uploaded, but others failed with `File does not exists in this diretory`. Other file types were not supported at all. Since those file types aren't currently supported, I might be able to write a script that copies each file's text into a `.md` file, or another format that will import (since not all `.md` files worked). It would be nice if AnythingLLM could either support these formats or find a way to import them automatically so scripts aren't needed. I understand my use case might be difficult to accommodate, so I'm not mad at AnythingLLM.

B2) Unable to easily select multiple docs in `My Documents` to delete several at once
I have docs in `My Documents` that uploaded successfully, but a whole lot more that didn't. I want to delete some from my `Custom Documents`, but not all of them. I can't `Shift + Click` to select multiple, so I have to manually click one at a time to delete (plus a verification popup each time).

Goal 2: Use the AnythingLLM API from other development tools to run my LLM queries programmatically, with my own external system prompts overriding the AnythingLLM system prompt, while still using the embeddings in the VectorDB that AnythingLLM generated from my custom documents in my Workspace.
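Circling back to B1 for a second, the conversion script I mentioned might look something like this. This is only a rough sketch: the list of extensions is a placeholder for whatever file types fail to import for you, and the output naming scheme is just my own convention.

```python
from pathlib import Path

# Placeholder set: swap in whichever extensions fail to import for you.
CONVERTIBLE = {".ts", ".tsx", ".vue", ".toml", ".json"}

def mirror_as_markdown(src_dir: str, out_dir: str) -> int:
    """Copy the text of unsupported files into .md files for ingestion.

    Returns the number of files written.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = 0
    for path in Path(src_dir).rglob("*"):
        if path.is_file() and path.suffix.lower() in CONVERTIBLE:
            text = path.read_text(encoding="utf-8", errors="replace")
            # Flatten the relative path into the file name so nothing collides.
            name = "_".join(path.relative_to(src_dir).parts) + ".md"
            (out / name).write_text(f"# {path.name}\n\n{text}\n", encoding="utf-8")
            written += 1
    return written
```

Running `mirror_as_markdown("my-repo", "md-mirror")` would then give you a folder of `.md` files to drag into `My Documents`.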
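To illustrate Goal 2, here's roughly how I'd want to call it from my own tooling. To be clear, this is a sketch based on my skim of the swagger page: the endpoint path, the payload fields, and the bearer-key header are all my assumptions, not verified behavior.

```python
import json
import urllib.request

API_BASE = "http://localhost:3001/api"  # assumption: same host/port as the swagger page
API_KEY = "YOUR-ANYTHINGLLM-API-KEY"    # assumption: a key generated in the app settings

def build_chat_request(workspace_slug: str, message: str) -> urllib.request.Request:
    """Build a POST request for a workspace chat endpoint (path is an assumption)."""
    payload = {"message": message, "mode": "chat"}
    return urllib.request.Request(
        f"{API_BASE}/v1/workspace/{workspace_slug}/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("my-workspace", "Summarize my custom documents.")
print(req.full_url)  # -> http://localhost:3001/api/v1/workspace/my-workspace/chat
# resp = urllib.request.urlopen(req)  # only against a running instance
```

What I can't tell from the docs page is whether a request like this respects my external system prompt or always injects the workspace's own prompt, which is the crux of Goal 2.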
At `http://localhost:3001/api/docs/` I see an interface that looks like LiteLLM, but I'm unsure whether I can override the system prompt associated with AnythingLLM.

Separate Issues from Goals:
Running Locally with Docker is confusing
Issue 1: The docker-compose.yml doesn't seem to document how to set local storage, or any procedure for pulling the latest image without losing data. I attempted to use the docker-compose file after pulling the repo, but for some reason it failed.
Solution(s): Perhaps I made an error running the docker-compose.yml locally. I have Docker experience creating my own custom dev environments and running other GitHub projects from a docker-compose.yml, but just didn't have luck with this one. If this isn't designed to run locally for some reason, that should be indicated somewhere.
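For reference, this is roughly the shape of compose file I expected to find. The image name, storage path, and environment variable here are assumptions from my memory of the docs, so double-check them against the repo:

```yaml
version: "3.8"
services:
  anythingllm:
    image: mintplexlabs/anythingllm:latest  # assumption: the published image name
    ports:
      - "3001:3001"
    volumes:
      # Persist storage on the host so `docker-compose pull` + `up -d` keeps data.
      - ./anythingllm-storage:/app/server/storage  # assumption: container storage path
    environment:
      - STORAGE_DIR=/app/server/storage
```

Something like this in the repo (or the docs) would have answered both the local-storage and the upgrade-without-data-loss questions for me.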
Issue 2: After the above failed, I went to the AnythingLLM documentation and, due to my lack of detailed Docker understanding, couldn't immediately figure out how to make the following work from `Run AnythingLLM locally`.
Solution(s): This also might be down to my limited understanding of Docker, but it would be nice to provide a simpler example of how to run this using either Docker Desktop or a docker-compose.yml file, with details for people who aren't specialized in Docker.
What I did to get this working was: `docker-compose down`, then `docker-compose pull`, then `docker-compose up -d`, and after verifying that I didn't lose any data, I shared those instructions with others who were stuck. To be clear, I like this solution overall and I currently run it as a local LLM using Ollama + LiteLLM, but.... As a separate concern, I misunderstood that
`http://host.docker.internal:xxxx` was supposed to go (with my LiteLLM port) into `AnythingLLM > LLM Preference > Local AI Base URL`, instead of me trying to figure out how to connect Docker to that port in the docker-compose.yml file, lol. Oh, and since I'm on a roll... I wanted to mention this:
B1.2) JSON files seemed to also not be supported! Pretty big deal for `Goal 1`. Ideally `Goal 1` would work together with `Goal 2`, but since neither of them works, even though I think you have an overall nice product, it just doesn't work for me; I can't recommend it and won't spend the time to help others anymore :(