itsPreto / baby-code

100% Private & Simple. OSS 🐍 Code Interpreter for LLMs 🦙
https://github.com/itsPreto/baby-code

add --recurse-submodules for installation section #10

Closed Jipok closed 1 year ago

Jipok commented 1 year ago

Although in my opinion submodules are not very convenient (for you), they are still preferable to what was there before.
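For reference, a sketch of the installation flow the PR title suggests (the repo URL is taken from this page; the submodule layout is assumed to include llama.cpp, as discussed elsewhere in the thread):

```shell
# Clone the repo together with its submodules in one step
git clone --recurse-submodules https://github.com/itsPreto/baby-code.git

# If the repo was already cloned without submodules, fetch them afterwards
git submodule update --init --recursive
```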

itsPreto commented 1 year ago

@Jipok Convenience is a tradeoff I'm okay with for the time being, though I have plans to eventually allow custom backends instead of relying on llama.cpp. If you have any ideas/suggestions on how to simplify the setup, in whole or in part, I'd love to hear them. I'll be creating a Discord space at some point this week.

Jipok commented 1 year ago

> If you have any ideas/suggestions on how to simplify the setup, in whole or in part, I'd love to hear them.

Jipok commented 1 year ago

In the Model Download section:

> The 7B Llama-2 based model TheBloke/WizardCoder-Python-13B-V1.0-GGUF is a model fine-tuned by a kind redditor

This sentence has several logical errors. It looks like it previously linked to a different model.

Jipok commented 1 year ago

[image] This is funny.

Jipok commented 1 year ago

The first thing the user sees after launch is:

[image]

I think many, like me, will try to open http://127.0.0.1:8081 and get:

[image]

Make it obvious which link to open, and preferably hide the output from llama.cpp. Perhaps you should automatically open the right link in the browser. I didn't like that behavior before, but many projects do it and I've gotten used to it.
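A minimal sketch of the auto-open idea, assuming a Python server and the URL shown above (the function name and delay are hypothetical; `webbrowser` and `threading` are standard library):

```python
import threading
import webbrowser

# Assumed frontend address, taken from the discussion above
FRONTEND_URL = "http://127.0.0.1:8081"

def open_ui(url: str = FRONTEND_URL, delay: float = 1.0) -> None:
    """Open the frontend in the default browser shortly after the server starts."""
    # A timer avoids opening the browser before the server is listening
    threading.Timer(delay, webbrowser.open, args=(url,)).start()
```

Calling `open_ui()` right after the server begins listening would spare users from guessing which of the printed llama.cpp URLs to open.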

Jipok commented 1 year ago

itsPreto commented 1 year ago

I addressed most of the issues-- I'll be merging the PR so feel free to open an issue for the outstanding changes. Thanks a bunch!