Closed smalik2043 closed 6 months ago
I think you need to turn off JSON format mode: `ollama.setJSONFormat(false);`. The output from the model includes characters that are not just a JSON object (the ```homeassistant block markers). See output.gbnf for the actual grammar, but I don't think Ollama lets you provide a specific grammar.
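Since Ollama doesn't accept a custom grammar, one workaround is to post-process the raw completion and pull the service calls out of the ```homeassistant fence yourself. A minimal sketch in plain JavaScript (the function name and regex are my own, not part of home-llm or any Ollama client):

```javascript
// Split a Home-3B completion into the spoken reply and any service calls.
// The model emits service calls inside a ```homeassistant fenced block.
function parseCompletion(output) {
  const match = output.match(/```homeassistant\n([\s\S]*?)```/);
  const serviceCalls = match
    ? match[1].trim().split('\n').filter(Boolean)
    : [];
  // Everything outside the fenced block is the natural-language reply.
  const reply = output.replace(/```homeassistant\n[\s\S]*?```/g, '').trim();
  return { reply, serviceCalls };
}

// Example completion shaped like the README's sample output
// (the entity id is taken from the prompt in this thread).
const sample =
  'turning on the Lamp Two Light for you now\n' +
  '```homeassistant\n' +
  'light.turn_on(light.db8076b0-be6f-11ed-a65f-f5afc3dc38ee)\n' +
  '```';
console.log(parseCompletion(sample));
```

This keeps JSON mode off and treats the fenced block as the machine-readable part, which sidesteps the parse errors from the extra markers.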
@acon96 Thank you for the response. I have tried your suggestion; now I am getting the response below:
```js
{
  output: " Dimmable LED Table Lamp, we've got you covered with a range of options to choose from.\n" +
    'When it comes to the design of the lamp, there are several options available such as an antique bronze finish or a sleek silver finish. You can also select from different shades including clear, amber, and frosted glass. Additionally, the Lamp Two Light Dimmable LED Table Lamp is dimmable, which means you have control over how much light your space receives.\n' +
    'The lamp uses energy-efficient LED technology, making it both eco-friendly and cost-effective in the long run. It also has a rechargeable battery that can last up to 10 hours before needing a charge. This feature makes it convenient for use during power outages or when you need portable lighting.\n' +
    "Overall, if you're looking for a stylish and functional table lamp option, the Lamp Two Light Dimmable LED Table Lamp is definitely worth considering.",
  stats: {
    model: 'homellm:latest',
    created_at: '2024-04-03T07:48:36.725986Z',
    response: '',
    done: true,
    context: [
      22529, 745, 418, 1301, 5761, 10315, 399, 12784, 494, 15862,
      5270, 418, 1301, 13, 359, 1849, 1694, 368, 6107, 342,
      247, 2491, 273, 4610, 281, 5206, 432, 15, 187, 3039,
      352, 3249, 281, 253, 2216, 273, 253, 18067, 13, 627,
      403, 2067, 4610, 2130, 824, 347, 271, 41450, 26247, 8416,
      390, 247, 47115, 9711, 8416, 15, 1422, 476, 671, 3609,
      432, 1027, 30553, 1690, 2590, 13, 717, 589, 13, 285,
      8954, 30230, 5253, 15, 9157, 13, 253, 418, 1301, 5761,
      10315, 399, 12784, 494, 15862, 5270, 418, 1301, 310, 3317,
      36426, 13, 534, 2097, 368, 452, 1453, 689, 849, 1199,
      ... 100 more items
    ],
    total_duration: 13678377002,
    load_duration: 239816,
    prompt_eval_duration: 71266000,
    eval_count: 195,
    eval_duration: 13606104000
  }
}
```
I am trying to get the response shown below from the home-llm model (acon96/Home-3B-v3-GGUF), following the example in the README: https://github.com/acon96/home-llm/blob/develop/README.md. This is the prompt I am sending:

```
You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
Services: light.turn_off(), light.turn_on(brightness,rgb_color)
Devices:
light.f8116fe0-a2eb-11ed-bef4-c7c47e09aabb 'Lamp One Light' = on;80%
light.db8076b0-be6f-11ed-a65f-f5afc3dc38ee 'Lamp Two Light' = on;80%
light.cfc9ac30-ddd0-11ed-928c-a95e465b84db 'Lamp 3' = on;80%
light.96ee3a50-dec2-11ed-928c-a95e465b84db 'Lamp 4' = on;80%
```
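The prompt follows the format from the home-llm README: a persona line, a `Services:` line, then one entry per device. A hedged sketch of assembling it programmatically (the helper name and device object shape are my own, not from the repo):

```javascript
// Build the Home-3B system prompt from a service list and device states,
// following the format in the home-llm README.
function buildSystemPrompt(services, devices) {
  const header =
    "You are 'Al', a helpful AI Assistant that controls the devices in a house. " +
    'Complete the following task as instructed with the information provided only.';
  const serviceLine = 'Services: ' + services.join(', ');
  // Each device renders as: <entity_id> '<name>' = <state>;<brightness>%
  const deviceLines = devices.map(
    (d) => `${d.entityId} '${d.name}' = ${d.state};${d.brightness}%`
  );
  return [header, serviceLine, 'Devices:', ...deviceLines].join('\n');
}

const prompt = buildSystemPrompt(
  ['light.turn_off()', 'light.turn_on(brightness,rgb_color)'],
  [
    {
      entityId: 'light.db8076b0-be6f-11ed-a65f-f5afc3dc38ee',
      name: 'Lamp Two Light',
      state: 'on',
      brightness: 80,
    },
  ]
);
console.log(prompt);
```

Keeping the prompt byte-identical to the README's layout matters with a small fine-tuned model like this; deviations in the `Services:`/`Devices:` framing tend to degrade the structured output.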
Can somebody review the code, confirm whether I am on the right track, and advise how to achieve the response below?

```
turning on the Lamp Two Light for you now
```