Closed — theP13RC3 closed this issue 6 years ago
Mouth movements could also change for different "emotions," but I don't know how tough that would be.
Ha, I like the idea of "pouty Mycroft!". We just released the code for the https://github.com/MycroftAI/enclosure-mark1 if you want to experiment with different looks.
Hey, I know how to change the eyes with Python. I could open a pull request with the code if that helps!
Here is some sample code for changing the eye color:
    def eyes_color(self, r, g, b):
        """
        Change the color of the eyes on the Mark 1.

        Args:
            r (int): 0-255, red value
            g (int): 0-255, green value
            b (int): 0-255, blue value
        """
        mycroft_type = '"enclosure.eyes.color"'
        mycroft_data = '{"r": %s, "g": %s, "b": %s}, ' \
                       '"context": null' % (r, g, b)
        message = '{"type": ' + mycroft_type + \
                  ', "data": ' + mycroft_data + '}'
        print(message)  # debug: show the raw messagebus payload
        self._ws.send(message)
        return "Sent command to Mycroft to change the eye color"
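For reference, the same payload can be built with `json.dumps` instead of string concatenation, which avoids quoting mistakes. This is just a sketch (`build_eyes_color_message` is a hypothetical helper name); the actual send would still go through the websocket as above.

```python
import json

def build_eyes_color_message(r, g, b):
    """Build the messagebus payload for changing the Mark 1 eye color.

    Produces the same JSON as the hand-built string above, but
    json.dumps guarantees the output is valid JSON.
    """
    return json.dumps({
        "type": "enclosure.eyes.color",
        "data": {"r": r, "g": g, "b": b},
        "context": None,
    })

# Example: red eyes
print(build_eyes_color_message(255, 0, 0))
# {"type": "enclosure.eyes.color", "data": {"r": 255, "g": 0, "b": 0}, "context": null}
```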
Probably way too difficult and deep into core, but it would be hilarious/awesome if it modulated his diction, too. Perhaps he starts using decorative and supplemental adjectives and subordinate clauses that coincide with the emotion: "Here is your stupid information" when mad, "Your order is placed, /I hope/." when sad, "Excellent! It is my pleasure to inform you that your order is submitted!" when happy.
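A crude version of that diction modulation could be template-based: pick a response decoration by emotion. Everything here is made up for illustration (the template names and `emote` helper are not part of mycroft-core); a real implementation would hook into the dialog layer.

```python
# Hypothetical emotion-flavored response templates. A real skill would
# plug into Mycroft's dialog rendering; this just decorates a base phrase.
TEMPLATES = {
    "mad": "Here is your stupid {thing}.",
    "sad": "Your {thing} is done, I hope.",
    "glad": "Excellent! It is my pleasure to report that your {thing} is done!",
}

def emote(thing, emotion="glad"):
    """Return a response string colored by the given emotion keyword."""
    return TEMPLATES.get(emotion, "Your {thing} is done.").format(thing=thing)

print(emote("order", "mad"))
# Here is your stupid order.
```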
I just thought of how to do this with skeledrew's brain skill. You could make a command word, let's say "Hey Mycroft, get mad," and make it so that whenever you said it, his eyes went red, he looked down, and he said something.
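Inside a skill, that command-word idea mostly reduces to an emotion-to-color table. A sketch, assuming an intent handler would then call the `EnclosureAPI.eyes_color(r, g, b)` method from mycroft-core (the mapping and `color_for_emotion` helper are my assumptions, not an existing API):

```python
# Hypothetical emotion-to-eye-color mapping for a "get mad / get sad" skill.
# In a MycroftSkill intent handler you would then do something like:
#     self.enclosure.eyes_color(*color_for_emotion(emotion))
EMOTION_COLORS = {
    "mad": (255, 0, 0),    # red eyes
    "sad": (0, 255, 0),    # green eyes, per the skill description below
    "glad": (0, 180, 255), # placeholder for the default color
}

def color_for_emotion(emotion, default=(0, 180, 255)):
    """Return the (r, g, b) eye color for a spoken emotion keyword."""
    return EMOTION_COLORS.get(emotion.lower(), default)
```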
This could actually be combined with another skill to help teach proper etiquette, working together with the parents to guide the owner's children when they use the Mycroft instance. That way the behaviour can be reinforced with an appropriate consequential response, with Mycroft giving an emotional response that fits the situation.
It would be even more hilarious/awesome if this were paired with the visual user interface of the Mark 2 unit (when installed on said unit). The face of the AI could change to show the appropriate emotion when you speak to it.
I built an "Alexa - How Am I" skill which actually did the opposite: when asked "How am I?", it used an external camera to check your mood and emotions, and responded with a funny quip about the emotion the camera perceived. This sort of integration requires a library that performs something close to real-time emotion detection, and it keeps a Raspberry Pi very busy (OpenCV is a monster).

I also built a library for Hue which modified external Hue lighting based on the perceived emotion of the person in front of the camera (angry = red, jealous = green, sad = blue, etc.). It is hilarious to work in front of this camera and have the lighting change throughout the day... don't code angry :p

I haven't looked too deeply into the platform yet, but I guess the same could be built into Mycroft as a skill, changing the LEDs accordingly, or even affecting external lighting when activated. I'm willing to open source the Hue code for this purpose, although the emotion detection needs a new implementation because I can't open source the emotion tracking library. Wondering if anyone has something decent and open source that does emotion detection and doesn't hog 99% of the CPU.
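The Hue side of that idea can be sketched without the emotion tracker: map a perceived emotion to an RGB color and convert it to Hue's hue/saturation ranges. The emotion labels and RGB picks are assumptions from the comment above; with the phue library you would then pass the resulting dict to `Bridge.set_light(light_id, command)`.

```python
import colorsys

# Assumed emotion-to-RGB mapping (angry = red, jealous = green, sad = blue,
# as described above); unknown emotions fall back to white.
EMOTION_RGB = {
    "angry": (255, 0, 0),
    "jealous": (0, 255, 0),
    "sad": (0, 0, 255),
}

def hue_command(emotion):
    """Convert an emotion label to a Philips Hue light command dict.

    Hue expects hue in 0-65535 and saturation in 0-254.
    """
    r, g, b = EMOTION_RGB.get(emotion, (255, 255, 255))
    h, s, _v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return {"hue": int(h * 65535), "sat": int(s * 254), "on": True}

print(hue_command("angry"))
# {'hue': 0, 'sat': 254, 'on': True}
```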
Hi there @theP13RC3, thanks so much for raising this as a request. We've now moved all of our Skill Suggestions / requests to https://community.mycroft.ai/c/skill-suggestions Apologies for the extra work, but could I request that you write up this Skill Suggestion there? That way our community is able to vote on it. Kind regards, Kathy
Allows the user to change the current color scheme of Mycroft's LEDs, using commands like "Mycroft, get mad" (red eyes), "Mycroft, get sad" (green eyes), and "Mycroft, get glad" (default).