We have been exploring the control algorithm that maps piano MIDI messages to UR5 control commands. The piano-playing robot can thus mimic a human pianist's expressive playing by replaying the MIDI signals the pianist generates. But can the robot have feelings about a piece of music by itself? Can it even create music while under some emotion, as a human pianist would?
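To make the mapping concrete, here is a minimal sketch of how a MIDI note event could become a UR5-style move command. The key geometry, the command tuple format, and the velocity-to-speed rule are illustrative assumptions, not the actual controller interface.

```python
# Hypothetical sketch of a MIDI-to-UR5 mapping; the geometry constants
# and command format below are assumptions for illustration only.

KEY_WIDTH_M = 0.0235   # approximate width of one white piano key, in metres
A0_NOTE = 21           # lowest key on an 88-key piano, in MIDI numbering

def note_to_key_offset(note):
    """Map a MIDI note number to a horizontal offset (metres) along the keyboard."""
    # Rough approximation: 12 semitones span 7 white-key widths per octave.
    return (note - A0_NOTE) * KEY_WIDTH_M * 7 / 12

def midi_to_command(msg):
    """Turn a (type, note, velocity) MIDI tuple into a toy UR5-style move command."""
    kind, note, velocity = msg
    x = note_to_key_offset(note)
    if kind == "note_on" and velocity > 0:
        # Louder note (higher velocity) -> faster downward key press.
        speed = 0.05 + 0.45 * (velocity / 127.0)
        return ("movel", x, -0.01, speed)   # press 10 mm below key surface
    return ("movel", x, 0.02, 0.1)          # note off: lift 20 mm above the key

print(midi_to_command(("note_on", 60, 100)))
```

The real controller would of course also handle timing, multi-note chords, and inverse kinematics; this only shows the note-to-pose-and-speed idea.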
By integrating research on:

- piano-playing robotic hand control
- body-movement generation for piano-playing robots
  (Li, Bochen, et al. "Skeleton Plays Piano: Online Generation of Pianist Body Movements from MIDI Performance." ISMIR (2018). http://ismir2018.ircam.fr/doc/pdfs/109_Paper.pdf)
- music composition with deep learning
  (Sturm, Bob L., et al. "Music Transcription Modelling and Composition Using Deep Learning." arXiv preprint arXiv:1604.08723 (2016). https://arxiv.org/pdf/1604.08723.pdf)
- robot facial-expression generation
  (M. Han, C. Lin, and K. Song, "Robotic Emotional Expression Generation Based on Mood Transition and Personality Model," IEEE Transactions on Cybernetics, vol. 43, no. 4, pp. 1290-1303, Aug. 2013, doi: 10.1109/TSMCB.2012.2228851. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6376252)
We could possibly build a robot model with expressive piano-playing ability. A chaotic system, such as the Bernoulli ball, could serve as the seed of the robot's emotion generation. The composed music (as MIDI messages), body movements, finger movements, and facial expressions would then be generated accordingly. By implementing the model on a 3D piano-playing humanoid robot, we could perform a Turing test: would observers be able to tell whether the pianist is a robot or not?
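The chaotic-seed idea can be sketched in a few lines. Since the Bernoulli ball is a physical device, the sketch below substitutes a standard chaotic map (the logistic map at r = 4) as a software stand-in, and the rule folding the trajectory into a (valence, arousal) emotion pair is an assumption made up for illustration.

```python
def logistic_map(x):
    """One step of the logistic map at r = 4, a simple chaotic system
    used here as a software stand-in for a physical chaotic seed."""
    return 4.0 * x * (1.0 - x)

def emotion_seed(x0, steps=50):
    """Iterate the chaotic map and fold the trajectory into a hypothetical
    (valence, arousal) pair: valence in [-1, 1], arousal in [0, 1]."""
    x = x0
    traj = []
    for _ in range(steps):
        x = logistic_map(x)
        traj.append(x)
    # Illustrative mapping: trajectory mean drives valence,
    # trajectory variance drives arousal.
    mean = sum(traj) / len(traj)
    var = sum((v - mean) ** 2 for v in traj) / len(traj)
    valence = 2.0 * mean - 1.0
    arousal = min(1.0, 8.0 * var)
    return valence, arousal

print(emotion_seed(0.123))
```

Because the map is chaotic, nearly identical initial conditions drift into different emotion states, which is exactly the unpredictability the seed is meant to provide; downstream modules (composition, body movement, facial expression) would then condition on the resulting (valence, arousal) pair.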