Inochi2D / inochi-session

Application that allows streaming with Inochi2D puppets
https://inochi2d.com
BSD 2-Clause "Simplified" License

[Feature Request] OSC tracking #41

Open IcedQuinn opened 1 year ago

IcedQuinn commented 1 year ago


Description

Support for the Open Sound Control (OSC) protocol as a tracker source. Possibly as a control plane for the rest of the program as well.

Benefits

An external panel can then be used to change parameters (via OSC) so that, for example, the character shows a cartoon sweat effect while a "terrified" button is held.

Suggested solution

Support OSC messages to control the software, and/or as a tracker input.

Alternative solution

No response

Additional Context

No response

LunaTheFoxgirl commented 1 year ago

OSC is already supported via the VMC standard set of mappings; do these other OSC programs use a different OSC chunk layout?

For reference, VMC is just a set of defined OSC paths and value orders at those paths; as such, vmc-d is already an OSC client (and server).

IcedQuinn commented 1 year ago

the VMC standard set of mappings

I had non-standard uses in mind.

For example, a rigged blob emoji with a "normal" and a "scared" state. The trackers don't know when to trigger this animation, so a Stream Deck button is rigged to report "scared" as 1 while held. The tracker software transmits face data to Session, while the hold button on the desk transmits whether the emote is on or not. Coming from a music background, this is one of the most obvious ways to do it.

do these other OSC programs use a different OSC chunk layout?

You generally tell the trigger and orchestration software what bundles to send.

For reference VMC is just a set of defined OSC paths and value orders at those paths,

I spent a few minutes trying to find the spec for what bundles to send for that; I ended up on Japanese pages that went around in circles. :/

LunaTheFoxgirl commented 1 year ago

I spent a few minutes trying to find the spec for what bundles to send for that; I ended up on Japanese pages that went around in circles. :/

You send bones (Name: string) (XYZ: float3) (Quaternion: float4) to /VMC/Ext/Bone

You send blendshapes (Name: string) (Value: float) to /VMC/Ext/Blend/Apply

Inochi Session will accept any blendshape and bone names and will display them in the list and allow you to use them from expression bindings.
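
As a minimal sketch, sending messages with that argument layout from Python with the python-osc package could look like the following (host and port are placeholders; point them at wherever Session's VMC receiver is configured to listen):

from pythonosc.udp_client import SimpleUDPClient

# Placeholder host/port -- use Session's configured VMC receiver address.
client = SimpleUDPClient("127.0.0.1", 39540)

# Bone: (Name: string) (XYZ: float3) (Quaternion: float4)
client.send_message("/VMC/Ext/Bone", ["Head", 0.0, 1.5, 0.0, 0.0, 0.0, 0.0, 1.0])

# Blendshape: (Name: string) (Value: float)
client.send_message("/VMC/Ext/Blend/Apply", ["Scared", 1.0])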

IcedQuinn commented 1 year ago


I'll take a look.

IcedQuinn commented 1 year ago

I tested this with Ossia Score 3. It crashes Session upon sending OSC packets to the UDP endpoint.

icedquinn@astaraline ~/D/inochi-session-linux-x86_64 [SIGKILL]> ./inochi-session
[INFO] Inochi Session v0.8.0, args=[]
[INFO] Lua support initialized. (Statically linked for now)
[INFO] Scanning plugins at /home/icedquinn/.config/inochi-session/plugins...
[INFO] Found zone Yass
[ERR ] Could not start texture sharing, it will be disabled. Is the library missing?
core.exception.ArrayIndexError@../../../.dub/packages/vmc-d-1.1.3/vmc-d/source/osc/message.d(140): index [1] is out of bounds for array of length 1
----------------


IcedQuinn commented 1 year ago

Did some testing, and it looks like the issue is that sending /VMC/Ext/Blend/Catte with a float value causes Session to flip out, since it does not expect this. That is probably a separate bug to be filed.

I tried again, this time sending to /VMC/Ext/Blend/Apply, which does not trigger a crash but also doesn't seem to read the input. It looks like the VMC endpoint expects a remote-procedure-call style where an entire bundle is sent to the endpoint, whereas Ossia et al. use a traditional OSC layout where the path name is the name of the parameter and sending a single float there changes the value.

It might be possible to get the other tools to produce the RPC-like calls with adapter code on that side. That isn't quite how OSC automation is meant to work and will take some digging and bodging on my part.

IcedQuinn commented 1 year ago

It looks like the VMC endpoint expects a remote-procedure-call style where an entire bundle is sent to the endpoint, whereas Ossia et al. use a traditional OSC layout where the path name is the name of the parameter and sending a single float there changes the value.

Meaning, a traditional OSC tracker would use the path as the name of the sensor: it would send something like /temperature with a float as its argument. Contrast that with VMC, which uses the path as an HTTP-like function name and expects the parameter's name as an argument to the function call.
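
A small sketch of the difference, using python-osc (host, port, and paths are just placeholders):

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # placeholder host/port

# Traditional OSC: the path *is* the parameter; the payload is just the value.
client.send_message("/temperature", 21.5)

# VMC-style OSC: the path names a function; the parameter's name travels
# as the first argument of the call.
client.send_message("/VMC/Ext/Blend/Val", ["Scared", 1.0])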

grillo-delmal commented 1 year ago

I was looking into this, checking the VMC spec and the code.

From the spec (VRM BlendShapeProxyValue), what is expected is to receive a "string, float" pair on the /VMC/Ext/Blend/Val endpoint, which is stored in the application until /VMC/Ext/Blend/Apply is received, at which point all of the stored values are applied at the same time.

The VMC adaptor implementation receives "string, float" pairs on any endpoint that starts with /VMC/Ext/Blend and applies the blend immediately. It also ignores whatever you send to /VMC/Ext/Blend/Apply. That is enough for inochi-session to interact with other software that implements the protocol (I have played with this extensively), but it is not up to standard.

Since it expects the blend name as part of the message, it might not fulfill your specific needs, but if you can send the key-value pair to /VMC/Ext/Blend, that should be enough for Session to use a custom blendshape from that tracker.
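
For reference, a sketch of the spec-conformant sequence with python-osc (host/port are placeholders):

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 39540)  # placeholder host/port

# Per the spec: buffer one or more (name, value) pairs...
client.send_message("/VMC/Ext/Blend/Val", ["Scared", 1.0])
client.send_message("/VMC/Ext/Blend/Val", ["Sweat", 0.8])
# ...then commit them all at once.
# (As noted above, Session's current adaptor applies each pair immediately
# and ignores this Apply message.)
client.send_message("/VMC/Ext/Blend/Apply", [])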

grillo-delmal commented 1 year ago

Now, on the topic of adding a custom OSC tracker: besides the configuration UI part, I don't think it would be hard to implement.

An idea would be to add a /I2D/Blendshape/* endpoint and a /I2D/Bone/* endpoint that receive just floating-point values, and let Session integrate the blendshape/bone names as it does with other protocols.
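
A sketch of what sending to such endpoints could look like with python-osc; these /I2D paths are hypothetical and do not exist yet, and host/port are placeholders:

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # placeholder host/port

# Hypothetical proposed layout: the path carries the name, the payload is
# just floating-point values.
client.send_message("/I2D/Blendshape/Scared", 1.0)
client.send_message("/I2D/Bone/Head", [0.0, 1.5, 0.0, 0.0, 0.0, 0.0, 1.0])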

IcedQuinn commented 1 year ago

Since it expects the blend name as part of the message, it might not fulfill your specific needs, but if you can send the key-value pair to /VMC/Ext/Blend, that should be enough for Session to use a custom blendshape from that tracker.

Correct. This is the "RPC-style" use of OSC I was talking about. It's not illegal OSC, but it's not the way the music and interactive art folks have typically done it, so the tools don't support it very well and I'll need to get with the other devs on that.

An idea would be to add a /I2D/Blendshape/* endpoint and a /I2D/Bone/* endpoint that receive just floating-point values, and let Session integrate the blendshape/bone names as it does with other protocols.

Yup.

In traditional OSC you would have separate endpoints for the parameters as well, like /Bone/Roll or /Bone/X. I think it's allowable to do both, where a message to /Bone carries a whole bundle of values while /Bone/X carries a singleton.
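
A rough sketch of that receiving side with python-osc; the /Bone paths, port, and handler are hypothetical, nothing here is implemented in Session:

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Accept both a full set of floats on /Bone and single floats on /Bone/<param>.
def handle(address, *args):
    parts = address.strip("/").split("/")
    if parts == ["Bone"]:
        print("full bone update:", args)  # e.g. x, y, z, qx, qy, qz, qw
    elif len(parts) == 2 and parts[0] == "Bone":
        print("single parameter", parts[1], "=", args[0])

dispatcher = Dispatcher()
dispatcher.set_default_handler(handle)
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()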