cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai
13.1k stars 1.42k forks

Infinite loop on JavaScript call to Dalai #465

Open nbeny opened 1 year ago

nbeny commented 1 year ago

Hello,

I am currently trying to integrate the solution into an API.

I am testing the Dalai JavaScript library to do this.

Here is my code:

const Dalai = require('dalai')

const test = () => {
  return new Promise((resolve, reject) => {
    let answer = ''
    new Dalai().request(
      {
        model: 'llama.7B',
        prompt:
          "listes 5 domaines ou ils y a le plus de chance de creer une boite avec l'IA ?",
      },
      (token) => {
        answer += token

        // Check whether the token is '<end>'
        if (token.includes('<end>')) {
          resolve(answer)
        }
      },
      (error) => {
        reject(error)
      }
    )
  })
}

;(async () => {
  try {
    const result = await test()
    console.log(result)
  } catch (error) {
    console.error('An error occurred:', error)
  }
})()

Functionally it works: I see the console.log(result) output at the end. But the program never stops; I have to press Ctrl+C. So something keeps running after the script should have finished. Do you know why?
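For reference, this symptom usually means some handle (a socket, timer, or child process) is still registered on the event loop after the Promise resolves, so node refuses to exit. Here is a minimal self-contained sketch of the same behavior, using a setInterval as a stand-in for whatever handle Dalai may leave open (the fakeRequest name and the timer are assumptions for illustration, not Dalai's actual internals):

```javascript
// Stand-in for the Dalai request: the Promise resolves normally,
// but a timer handle stays registered on the event loop afterwards.
const fakeRequest = () =>
  new Promise((resolve) => {
    // This interval keeps the process alive even after resolve(),
    // which is the same symptom as the script above.
    const handle = setInterval(() => {}, 1000)
    resolve({ answer: 'done', handle })
  })

;(async () => {
  const { answer, handle } = await fakeRequest()
  console.log(answer)
  // Without this cleanup (or an explicit process.exit(0)),
  // node never exits on its own.
  clearInterval(handle)
})()
```

Calling process.exit(0) after the log also terminates the script, but that is not a usable workaround inside a long-running API server.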

This breaks the possibility of integrating it with a REST API, so...

Thanks !