Since LLMs offer a natural-language interface to a corpus of data, a data sink node with a GPU or spare CPU compute could offer an LLM-style interface on top of that data:
"give me the list of all lights that are on" : summarize the list of lights that are on right now
"what was my average power consumption yesterday": compute the average power consumption
"has anyone visited the house in the last four hours": look up the logs from the last four hours of visitors, and summarize.