ohmplatform / FreedomGPT

This codebase is for a React and Electron-based app that executes the FreedomGPT LLM locally (offline and private) on Mac and Windows using a chat-based interface.
http://www.freedomgpt.com
GNU General Public License v3.0

[suggestion] Make Freedom GPT executable on Colab #14

Open · Juan-Cruz-Iturrioz opened this issue 1 year ago

Juan-Cruz-Iturrioz commented 1 year ago

I would like to suggest an improvement to Freedom GPT. It would be great if the model could be executed on Colab. This would allow easy access to and execution of the model in a cloud-based environment, without the need for a local installation.

Please consider this suggestion and let me know if there are any plans to implement it in the future. Thank you for your hard work on Freedom GPT!

cooperdk commented 1 year ago

Why... why? Also, eh, WHY.

AntonioCiolino commented 1 year ago

Faster CPU/GPU for trying it out before buying a decent ML machine?

cooperdk commented 1 year ago

I am pretty sure you need to write in English here.

cooperdk commented 1 year ago

> Faster CPU/GPU for trying it out before buying a decent ML machine?

Yeah, no. The binary behind it (alpaca) does not support GPU at all, so your own computer would be faster unless you use a (very) old laptop or have very little memory (this requires as little as 4-6 GB to run, apart from what your OS uses).

AntonioCiolino commented 1 year ago

Heh, I have a pair of older machines that run the app, but I don't get responses back even after ten minutes. One Windows, one Mac: dual-core 2.7 GHz, 16 GB RAM.

ItsPi3141 commented 1 year ago

Why? If you want to use Colab, then just use the oobabooga text-generation-webui or the llama.cpp CLI.
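
For reference, a minimal sketch of the llama.cpp route in a Colab cell. The repository URL is the real llama.cpp project, but the model path, prompt, and flag values are placeholders, and the exact binary name and flags depend on the llama.cpp version you build (newer releases name the CLI `llama-cli` instead of `main`). This bypasses the FreedomGPT Electron app entirely.

```
# Colab cell: build llama.cpp from source and run a single prompt.
# NOTE: /content/models/model.bin is a placeholder; supply your own
# ggml/gguf model file, and adjust flags for your llama.cpp version.
!git clone https://github.com/ggerganov/llama.cpp
!cd llama.cpp && make
!./llama.cpp/main -m /content/models/model.bin -p "Hello, how are you?" -n 128 -t 2
```
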