acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Response is not coming as expected using home-llm model #100

Closed: smalik2043 closed this issue 6 months ago

smalik2043 commented 6 months ago

I am trying to get the response below from the home-llm model (acon96/Home-3B-v3-GGUF), matching the example response shown in the README: https://github.com/acon96/home-llm/blob/develop/README.md

turning on the Lamp Two Light for you now
```homeassistant
{ "service": "light.turn_on", "target_device": "light.db8076b0-be6f-11ed-a65f-f5afc3dc38ee" }
```
But for some reason I am not getting the expected response. I am using the ollama-node library to configure the system prompt. Below is my code:

```typescript
import { Injectable } from '@nestjs/common';
import { Ollama } from 'ollama-node';
import { User } from 'user/user.entity';
import { UserService } from 'user/user.service';

@Injectable()
export class OllamaService {
  constructor(private readonly userService: UserService) {}

  public async responseFromOllama(data: any) {
    try {
      const { userId, message } = data;
      const systemPromptName = data.systemPromptName || 'Al';

      const user: User | undefined = await this.userService.findOne(userId);
      const systemPrompt = this.generateSystemPrompt(user?.settings?.devices, systemPromptName);
      console.log(systemPrompt);

      const ollama = new Ollama('host.docker.internal');
      await ollama.setModel('homellm:latest');
      ollama.setSystemPrompt(systemPrompt);
      ollama.setJSONFormat(true);
      const response = await ollama.generate(message);
      return response.output;
    } catch (e) {
      console.log(e);
    }
  }

  public generateSystemPrompt(devices: any, systemPromptName: string): string {
    let systemPrompt = `You are '${systemPromptName}', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.\n`;
    systemPrompt += `Services: light.turn_off(), light.turn_on(brightness,rgb_color)\n`;
    systemPrompt += `Devices:\n`;
    devices.forEach((device: any) => {
      const deviceInfo = `light.${device.cloudDeviceId} '${device.name}' = on;80%`;
      systemPrompt += `${deviceInfo}\n`;
    });

    return systemPrompt;
  }
}
```

The generated system prompt looks like this:

```
You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
Services: light.turn_off(), light.turn_on(brightness,rgb_color)
Devices:
light.f8116fe0-a2eb-11ed-bef4-c7c47e09aabb 'Lamp One Light' = on;80%
light.db8076b0-be6f-11ed-a65f-f5afc3dc38ee 'Lamp Two Light' = on;80%
light.cfc9ac30-ddd0-11ed-928c-a95e465b84db 'Lamp 3' = on;80%
light.96ee3a50-dec2-11ed-928c-a95e465b84db 'Lamp 4' = on;80%
```


Sometimes I get an empty response, and sometimes some random text.

The response I am getting is:

```json
{"message":"{ }\n\n \n \n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n"}
```

Can somebody review the code to check whether I am on the right track, and explain how to achieve the response below?

turning on the Lamp Two Light for you now
```homeassistant
{ "service": "light.turn_on", "target_device": "light.db8076b0-be6f-11ed-a65f-f5afc3dc38ee" }
```
acon96 commented 6 months ago

I think you need to disable JSON format mode: `ollama.setJSONFormat(false);`

The output from the model includes characters that are not just a JSON object (the ```homeassistant block markers). See output.gbnf for the actual grammar, but I don't think Ollama lets you provide a specific grammar.
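Since Ollama can't enforce the grammar, one option is to disable JSON mode and extract the service call from the raw output yourself. A minimal sketch (the `parseHomeAssistantResponse` helper and `ServiceCall` shape are my own illustrations, not part of ollama-node or home-llm):

```typescript
// Shape of the service call the model emits inside the fenced block
// (assumed from the README example; real output may include more fields).
interface ServiceCall {
  service: string;
  target_device: string;
}

// Split the model's raw output into the spoken text and the JSON service
// call wrapped in a ```homeassistant fenced block, e.g.:
//   turning on the Lamp Two Light for you now
//   ```homeassistant
//   { "service": "light.turn_on", "target_device": "light.xyz" }
//   ```
function parseHomeAssistantResponse(raw: string): { text: string; call: ServiceCall | null } {
  const match = raw.match(/```homeassistant\s*([\s\S]*?)```/);
  if (!match) {
    // No fenced block: treat the whole output as plain text.
    return { text: raw.trim(), call: null };
  }
  const text = raw.slice(0, match.index).trim();
  let call: ServiceCall | null = null;
  try {
    call = JSON.parse(match[1]);
  } catch {
    call = null; // model produced malformed JSON inside the block
  }
  return { text, call };
}
```

With `setJSONFormat(false)`, you would run this over `response.output` and forward `call` to Home Assistant while showing `text` to the user.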

smalik2043 commented 6 months ago

@acon96 Thank you for the response. I tried your suggestion, and now I am getting the response below:

```
{
  output: " Dimmable LED Table Lamp, we've got you covered with a range of options to choose from.\n" +
    'When it comes to the design of the lamp, there are several options available such as an antique bronze finish or a sleek silver finish. You can also select from different shades including clear, amber, and frosted glass. Additionally, the Lamp Two Light Dimmable LED Table Lamp is dimmable, which means you have control over how much light your space receives.\n' +
    'The lamp uses energy-efficient LED technology, making it both eco-friendly and cost-effective in the long run. It also has a rechargeable battery that can last up to 10 hours before needing a charge. This feature makes it convenient for use during power outages or when you need portable lighting.\n' +
    "Overall, if you're looking for a stylish and functional table lamp option, the Lamp Two Light Dimmable LED Table Lamp is definitely worth considering.",
  stats: {
    model: 'homellm:latest',
    created_at: '2024-04-03T07:48:36.725986Z',
    response: '',
    done: true,
    context: [
      22529,   745,   418, 1301,  5761, 10315,  399, 12784,   494, 15862,
       5270,   418,  1301,   13,   359,  1849, 1694,   368,  6107,   342,
        247,  2491,   273, 4610,   281,  5206,  432,    15,   187,  3039,
        352,  3249,   281,  253,  2216,   273,  253, 18067,    13,   627,
        403,  2067,  4610, 2130,   824,   347,  271, 41450, 26247,  8416,
        390,   247, 47115, 9711,  8416,    15, 1422,   476,   671,  3609,
        432,  1027, 30553, 1690,  2590,    13,  717,   589,    13,   285,
       8954, 30230,  5253,   15,  9157,    13,  253,   418,  1301,  5761,
      10315,   399, 12784,  494, 15862,  5270,  418,  1301,   310,  3317,
      36426,    13,   534, 2097,   368,   452, 1453,   689,   849,  1199,
      ... 100 more items
    ],
    total_duration: 13678377002,
    load_duration: 239816,
    prompt_eval_duration: 71266000,
    eval_count: 195,
    eval_duration: 13606104000
  }
}
```