seanedwards opened this issue 2 years ago
Second this!
I'm currently making an OSC drone system to sync camera coordinates from Unity into VRChat. If we had a way to output player position/rotation back into Unity, we could do player tracking or focusing automatically. It would be highly beneficial for VRChat machinima or cinematic filming.
OSC Drone Demo: https://twitter.com/YanKMW/status/1497074496867807232
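For anyone prototyping the outbound half of this: OSC messages are simple enough to build by hand with the standard library, so no OSC package is needed. This is a minimal sketch that sends float values to VRChat's default OSC input port (9000 on localhost); the `PositionX/Y/Z` parameter names are exactly what this issue is requesting and do **not** exist yet, so treat them as placeholders.

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address, ",f" type tag,
    then the value as a big-endian 32-bit float (per the OSC 1.0 spec)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Hypothetical parameter names -- this issue asks for PositionX/Y/Z to exist.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for name, v in [("PositionX", 1.5), ("PositionY", 0.0), ("PositionZ", -3.2)]:
    sock.sendto(osc_message(f"/avatar/parameters/{name}", v), ("127.0.0.1", 9000))
```

Receiving the proposed parameters back in Unity would be the same packet format in reverse, listening on VRChat's outbound OSC port.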
A YAngle parameter to complement AngularY would be great as well.
This would be great: it would allow for Quest-compatible world-space objects on avatars (as described above), requiring only avatar support and a mobile app or computer program :+1:
**What's the idea?** Basically, similar to the VelocityX/VelocityY/VelocityZ parameters, I would like to be able to read PositionX/PositionY/PositionZ parameters.
Using the world space constraint technique, this would allow me to create network-synced world space props that work for late joiners, by using animator parameters to represent the X/Y/Z position of the prop. In order to know where to place the prop, I need access to the world's coordinate system, and the current player position seems like a good reference point to use.
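The position-in-parameters idea above can be sketched as a simple encode/decode step. This assumes synced float parameters cover [-1, 1] (and are reportedly quantized fairly coarsely over the network), so a per-world extent has to be chosen up front; `WORLD_EXTENT` here is an arbitrary illustration, not something from the prefab.

```python
# Pack a world-space coordinate into a synced float parameter in [-1, 1].
WORLD_EXTENT = 100.0  # assumed max |coordinate| in metres for this world

def encode(coord: float) -> float:
    """Map a coordinate in [-WORLD_EXTENT, WORLD_EXTENT] to [-1, 1], clamped."""
    return max(-1.0, min(1.0, coord / WORLD_EXTENT))

def decode(param: float) -> float:
    """Recover the world-space coordinate from the parameter value."""
    return param * WORLD_EXTENT
```

Note that if the network quantizes the float to 8 bits, the resolution over a ±100 m extent is on the order of 0.8 m, so in practice one might split each axis across coarse and fine parameters.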
I have a prototype of this prefab up here: seanedwards/vrc-worldobject
**Is there another way?** I can probably get the world-space position another way, such as with a GameObject plus a shader that renders its own world-space position/orientation.