MyRobotLab / myrobotlab

Open Source Java Framework for Robotics and Creative Machine Control
Apache License 2.0

Creating new servo orchestrator service/gui #228

Open Alexinator40 opened 6 years ago

Alexinator40 commented 6 years ago

With an advanced InMoov head in development (created by myself), there is now a need for a generic service or GUI that can both control multiple servos simultaneously and chain small poses/gestures into much more intricate gestures. This would also give robot builders, new and old, an easy way to test servos and parts of their robots.
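The two requirements above can be sketched in a few lines of plain Python. This is only an illustration of the idea, not the MyRobotLab API: the `Servo` class, servo names, and timing scheme here are all hypothetical stand-ins.

```python
import time

# Hypothetical stand-in for a real servo; a real implementation would
# drive hardware instead of just storing the position.
class Servo:
    def __init__(self, name):
        self.name = name
        self.pos = 90

    def move_to(self, pos):
        self.pos = pos

servos = {n: Servo(n) for n in ("neck", "rothead", "jaw")}

def apply_pose(pose, delay=0.0):
    # a "pose" is a dict of servo -> position; all servos move together
    for name, pos in pose.items():
        servos[name].move_to(pos)
    time.sleep(delay)

def play_gesture(gesture):
    # a "gesture" chains small poses, each with its own dwell time
    for pose, delay in gesture:
        apply_pose(pose, delay)

# a simple nod: down, up, back to center
nod = [({"neck": 60}, 0.1), ({"neck": 120}, 0.1), ({"neck": 90}, 0.0)]
play_gesture(nod)
print(servos["neck"].pos)  # -> 90
```

A generic service would essentially own the `servos` map and expose `apply_pose`/`play_gesture` to the GUI and to scripts.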

supertick commented 6 years ago

references - https://github.com/MyRobotLab/inmoov/issues/171

supertick commented 6 years ago

@hairygael put this relevant ticket in with a video reference from EZ

An idea for the new gesture creator version? It would be great if it could work simultaneously with VirtualInMoov or in real mode... https://youtu.be/ECWUPJ0oDaQ

supertick commented 6 years ago

For reference this is the current Servo Orchestrator - http://myrobotlab.org/content/servoorchestrator-least-try-my-call-testers

@LunDev did an excellent job on it, and it still comes up in the Swing UI as ServoOrchestrator - but it might need some love.

I'm not sure anyone has a definition of what they want from a Gesture Creator vs. the (more general?) Servo Orchestrator - it would be helpful if people started to describe exactly what they need or want.

moz4r commented 6 years ago

Before we orchestrate things, we need to define them.

We need to define gestures related only to the skeleton first. Then we can play with things like: speak("I'm happy") -> doSmile()... The creator should be a basic, easy toolbox for ServoControl groups, with a few extras like forwarding gestures and speech. An option to export Python code would also be useful.
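The speak-then-gesture idea above could look roughly like this once exported as Python. All the names here (`do_smile`, `move_to`, the mouth servos) are illustrative, not real InMoov identifiers; real moves are replaced by a recorded list so the sketch stays self-contained.

```python
# Recorded (servo, position) calls, standing in for real hardware moves.
moves = []

def move_to(servo, pos):
    moves.append((servo, pos))

def do_smile():
    # a gesture the creator might export: a grouped set of servo moves
    move_to("mouth_left", 120)
    move_to("mouth_right", 120)

def speak(text):
    print(text)
    # toy forwarding rule: an utterance triggers a matching gesture
    if "happy" in text:
        do_smile()

speak("I'm happy")
print(moves)  # -> [('mouth_left', 120), ('mouth_right', 120)]
```

The point of the export-to-Python option is exactly this: a gesture built in the editor becomes a plain function that scripts can call directly.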

I think the orchestrator is a BIG thing, especially if we want to implement listener stuff like detecting the environment and programming related actions: beer detected -> take the beer from the table -> bring it to the human.

My point of view is two things: for now, a gesture creator/editor to script things easily with a basic UI, and an orchestrator for later.

LunDev commented 6 years ago

Let me provide some background information about the "original" ServoOrchestrator (now more than 4 years old <-- oops): the goal was to create an environment where you could place actions (at first, only servos) on a strictly-timed grid, inspired by digital audio workstations like Cubase, Ableton, Logic, Pro Tools, or FL Studio. That background is the reason there are placeholders for "muting" a device, letting it "play solo", and looping your selection (L/R) - and also that annoying click sound. Later you would have gotten the option to export your "grid" (or a part of it) as a Python script.

I abandoned the project because I lacked the time to continue it (at that point there were school exams that wanted to be taken, and an informatics competition (if I remember correctly, that year was my most successful participation)) - but I digress. After I regained some free time, my incentive to continue was very low: first, the service is written, frankly, in a very bad way, and second, it is way too complex - some magic arithmetic here, some random variable access there, it's got all of that stuff. (By the way: if nothing has changed since I last looked at it, the InMoovGestureCreator should be in a similar condition.)

TL;DR: the "original" ServoOrchestrator was a grid-based action scheduler.
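The grid concept described above can be sketched compactly. This is a hypothetical reconstruction, not the original code: rows are devices, columns are time slots, and playback walks the columns at a fixed tick, honoring the DAW-style mute and loop (L/R) features mentioned.

```python
import time

def play_grid(grid, tick=0.0, muted=frozenset(), loop=None, repeats=1):
    """Play a device -> [actions per time slot] grid; None means 'no action'.

    loop is an optional (start_col, end_col) pair, like a DAW's L/R markers.
    Returns the list of (column, device, action) events that fired.
    """
    events = []
    cols = max(len(row) for row in grid.values())
    start, end = loop if loop else (0, cols)
    for _ in range(repeats):
        for col in range(start, end):
            for device, row in grid.items():
                if device in muted or col >= len(row):
                    continue
                action = row[col]
                if action is not None:
                    events.append((col, device, action))
            time.sleep(tick)  # the "strictly-timed" part
    return events

grid = {
    "neck": [90, None, 60, None],
    "jaw":  [None, 30, None, 10],
}
# loop over the first two columns, twice
print(play_grid(grid, loop=(0, 2), repeats=2))
# -> [(0, 'neck', 90), (1, 'jaw', 30), (0, 'neck', 90), (1, 'jaw', 30)]
```

A rewrite could keep this plain data structure at the core and layer the GUI (and the drag-and-drop, if kept at all) on top of it.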

If anyone wants to start the journey of rewriting this (as in throwing it away and starting fresh), I have one really important piece of advice: stay organized, keep your variable references where they belong (and maybe ditch the drag-and-drop thing - it's kind of complex and very magic).

~ Marvin

Alexinator40 commented 6 years ago

This service/GUI will also need a way to dynamically change the limits of one servo based on the current configuration of one or more other servos. This would solve the advanced head's mechanical issue where the vertical limits of the eyeball, which are constrained by the eyelids, depend on the horizontal position of the eyeball. It should be possible to apply custom algorithms that determine how the limits change dynamically.
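A minimal sketch of such a coupling, assuming servo positions in degrees on a 0-180 range: the mapping below is a made-up placeholder for the "custom algorithm", chosen only to show the shape of the idea (the vertical range narrows as the eye turns away from center).

```python
def eye_y_limits(eye_x, x_min=0, x_max=180):
    """Vertical limits for the eyeball as a function of its horizontal position.

    Toy rule: full +/-60 degree range when the eye is centered, shrinking
    to +/-30 degrees at the horizontal extremes.
    """
    center = (x_min + x_max) / 2.0
    offset = abs(eye_x - center) / (x_max - center)  # 0 at center, 1 at edge
    span = 60 * (1.0 - 0.5 * offset)
    return 90 - span, 90 + span

def clamp(pos, limits):
    # keep any requested position inside the current dynamic limits
    lo, hi = limits
    return max(lo, min(hi, pos))

print(eye_y_limits(90))               # -> (30.0, 150.0)  centered: full range
print(eye_y_limits(180))              # -> (60.0, 120.0)  at the edge: reduced
print(clamp(140, eye_y_limits(180)))  # -> 120
```

The pluggable part is `eye_y_limits`: each mechanical coupling would supply its own function, and the service would re-evaluate it whenever the driving servo moves.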

LunDev commented 6 years ago

This is an interesting concept, although I don't think it should be implemented only in a GUI; otherwise, e.g., a call from a script could break something. In my opinion this should be handled by a service itself, and the GUI should use the (dynamically) published limits.
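The service-publishes, GUI-subscribes split could look like this. The class and method names are illustrative, not MyRobotLab's messaging API, and the coupling rule is a toy.

```python
class LimitService:
    """Hypothetical service owning the dynamic-limit computation."""

    def __init__(self):
        self.listeners = []
        self.limits = (0, 180)

    def subscribe(self, callback):
        # a GUI (or a script) registers to receive published limits
        self.listeners.append(callback)

    def on_servo_moved(self, eye_x):
        # recompute and publish limits whenever the coupled servo moves;
        # toy rule: full range near center, reduced range otherwise
        span = 60 if 60 <= eye_x <= 120 else 30
        self.limits = (90 - span, 90 + span)
        for cb in self.listeners:
            cb(self.limits)

received = []
svc = LimitService()
svc.subscribe(received.append)  # a real GUI would update its sliders here
svc.on_servo_moved(90)
svc.on_servo_moved(180)
print(received)  # -> [(30, 150), (60, 120)]
```

Because the service owns the limits, a script calling the servo directly goes through the same clamping as the GUI, which is exactly the failure mode being avoided.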