Closed. simonw closed this issue 6 months ago.
Hey @simonw ! Whoops, thanks for pointing this out. Two things to figure out before renaming:

1. `localllm` is impossible to type correctly repeatedly imo XD XD XD (the best new name I've thought of so far is `lol`, but that's a) probably taken and b) wow, not descriptive XD)
2. Is there any way we can do this in a backward-compatible way, so anyone who is already using it (or just reading the blog post talking about it) can keep using the previous command without trouble?

Inside the Cloud Workstations image I'm thinking we could provide `llm` as a symlink to whatever the new name is, and it seems like it might be easy enough to update the blog post.
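The symlink idea could be sketched like this (the paths and the stub script here are illustrative, not what the Cloud Workstations image actually ships):

```shell
# Sketch of the backward-compat shim: expose the old `llm` name as a
# symlink to the renamed binary. A stub stands in for the real tool.
mkdir -p /tmp/demo-bin
printf '#!/bin/sh\necho "local-llm invoked"\n' > /tmp/demo-bin/local-llm
chmod +x /tmp/demo-bin/local-llm

# The shim itself: old name -> new name.
ln -sf /tmp/demo-bin/local-llm /tmp/demo-bin/llm

/tmp/demo-bin/llm   # prints: local-llm invoked
```

Existing scripts and the blog post's instructions would keep working via the old name, while new docs could reference the new one.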
Just need to think of a new name now... 🤔 🤔 🤔
`localai` / `localgenai` / `localmodel`?
Thanks @gfsmfk ! Did some brainstorming with some other folks on my team, and here are all the options we've come up with:

From all these options I'm thinking `localllm` is the clearest and least confusing (and it removes the weird inconsistency between the repo being called `localllm` and the command being something else), but I'll want to make sure that `locallm` works too, since there are going to be a lot of typos.
If the tool is going to depend on platform- or cloud-vendor-specific aspects, perhaps it wouldn't hurt to keep the naming along those lines as well, to avoid misleading users? I just came across this tool, and the naming and its capabilities made it confusing to tell whether it could be used in conjunction with other FLOSS tools or not.

I.e., Workstations seems to be specific to Google Cloud or Compute Engine, and Hugging Face isn't the only registry for models (I understand Google has a partnership with them, though).
Good point @RyzeNGrind ! There's nothing cloud-specific in the tool itself - the motivation for making it initially was to have something in Cloud Workstations that would easily run local models, and the image-building steps in the README build a Cloud Workstations image, but the command-line tool doesn't have anything specific to this.

The Hugging Face integration is pretty specific, though 🤔 (and some of the hackier parts are actually specific to TheBloke 🤠). So I'm almost tempted to use one of the `hf` names, but I've talked myself out of it: if this is useful, there's no reason it couldn't be extended to support other model registries - `pull`, `run`, `kill`, and `ps` could work equally well for those too. Whether there's enough interest to actually do that is another question, though!

TL;DR I'm still leaning toward `localllm` as the best/least-worst option atm.
Quick update: ended up going with `local-llm` instead of `localllm`, since it is SO HARD to type `localllm` properly consistently.
https://github.com/GoogleCloudPlatform/localllm/blob/d27376fa3f6e6bcfcd3ae9c9c8f61e163a3c1899/llm-tool/setup.py#L19-L23
And:
https://github.com/GoogleCloudPlatform/localllm/blob/d27376fa3f6e6bcfcd3ae9c9c8f61e163a3c1899/llm-tool/setup.py#L34-L38
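As a rough sketch of what the rename could look like in the packaging metadata (the package name and the `llm_tool.cli:main` module path below are placeholders for illustration, not the repo's actual values):

```python
# Hypothetical setup.py fragment: publish the package and console script
# under "local-llm" so the CLI no longer collides with the existing
# "llm" distribution on PyPI. The entry-point target is a placeholder.
from setuptools import setup, find_packages

setup(
    name="local-llm",  # distinct name, so it can be published to PyPI
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            "local-llm=llm_tool.cli:main",  # placeholder module path
        ],
    },
)
```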
I'm the author of https://pypi.org/project/llm/, which installs a package called `llm` and a CLI tool called `llm` as well. My `llm` tool is similar to localllm in as much as it lets you execute prompts in the terminal, against both remote models and local models (using `llama-cpp-python`). As it stands, using my tool and this tool in the same environment won't work, because of the namespace clash.

If you pick a different name for this, you can also publish it to PyPI, which would make for a more convenient installation experience for end users.

https://llm.datasette.io/ has more about how my tool works and what it does.