Powerkrieger / NobbyGPT

Reimplementation of nanoGPT for educational purposes

1. Set up German LLM Fork (Baseline) #1

Closed Kuckuck44 closed 9 months ago

Kuckuck44 commented 9 months ago

How to import your own models:

  1. Download and install GPT4All using the installer
  2. Open Settings → navigate to "Application"
  3. Copy the path shown under "Download Path", or customize it
  4. Copy the downloaded model into this path to use it in the desktop client
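The last two steps can be sanity-checked with a small script. This is a minimal sketch, not part of GPT4All itself; the download path and model filename below are placeholders, so substitute the actual value copied from Settings → Application → "Download Path".

```python
from pathlib import Path

# Hypothetical download path -- replace with the value copied from
# GPT4All's Settings -> Application -> "Download Path".
download_path = Path.home() / "gpt4all" / "models"

# Hypothetical model filename; any .gguf file is handled the same way.
model_file = "llama-2-13b-german-assistant-v4.Q4_K_M.gguf"

def model_available(directory: Path, filename: str) -> bool:
    """Return True if the model file sits in the configured download path."""
    return (directory / filename).is_file()

if __name__ == "__main__":
    if model_available(download_path, model_file):
        print(f"{model_file} is ready for the desktop client.")
    else:
        print(f"Copy {model_file} into {download_path} first.")
```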

Results:

Kuckuck44 commented 8 months ago

For further testing I will try llama-2-13b-german-assistant-v4.Q4_K_M.gguf, as it is flagged as "medium, balanced quality - recommended".
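The "medium, balanced quality - recommended" note comes from the quantization tag embedded in the filename. As a rough illustration (the notes below paraphrase TheBloke's model-card descriptions and are not an official specification), the tag can be pulled out of the filename like this:

```python
# Rough mapping of common GGUF quantization suffixes to the quality notes
# TheBloke attaches to them (paraphrased; not an official specification).
QUANT_NOTES = {
    "Q2_K": "smallest, significant quality loss",
    "Q4_K_M": "medium, balanced quality - recommended",
    "Q5_K_M": "large, very low quality loss - recommended",
    "Q8_0": "very large, extremely low quality loss",
}

def quant_suffix(filename: str) -> str:
    """Extract the quantization tag from a GGUF filename such as
    'llama-2-13b-german-assistant-v4.Q4_K_M.gguf'."""
    parts = filename.split(".")
    # The tag is the second-to-last dot-separated component.
    return parts[-2] if len(parts) >= 3 else ""

name = "llama-2-13b-german-assistant-v4.Q4_K_M.gguf"
tag = quant_suffix(name)
print(tag, "->", QUANT_NOTES.get(tag, "unknown"))  # Q4_K_M -> medium, ...
```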

By the way, "TheBloke" only provides the "inferencable" model formats. The underlying model was fine-tuned by Florian Zimmermeister.

Kuckuck44 commented 8 months ago

To generate the first test data, I will use the prompt "Bitte generiere einen langen Aufsatz, basierend auf dem folgenden Anfang: "Hallo, mein Name ist"." ("Please generate a long essay based on the following beginning: 'Hello, my name is'."). The Max-Token parameter for the generated response has been manually raised to 131,072, instead of the default 4,096.

I will create a .txt file with at least 5,000 words collected from different responses to that input... anything more would take forever :)
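Collecting responses toward the 5,000-word target can be tracked with a small helper. This is a sketch with hypothetical function names (`word_count`, `append_response`), not code from this repo; the actual model responses would be pasted or piped in from GPT4All.

```python
from pathlib import Path

def word_count(text: str) -> int:
    """Count whitespace-separated words, the same rough metric as the
    5,000-word target for the test data."""
    return len(text.split())

def append_response(path, response: str, target_words: int = 5000) -> bool:
    """Append one model response to the collection .txt file and report
    whether the word target has been reached."""
    p = Path(path)
    with p.open("a", encoding="utf-8") as fh:
        fh.write(response.rstrip() + "\n\n")
    return word_count(p.read_text(encoding="utf-8")) >= target_words
```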

Kuckuck44 commented 7 months ago

Just to clarify: the admittedly excessive "Max-Token" setting was only for some testing of GPT4All, because the model initially behaved strangely.