secondlife / jira-archive


[BUG-231383] Nvidia Omniverse Connector #8869

Closed sl-service-account closed 7 months ago

sl-service-account commented 2 years ago

How would you like the feature to work?

Via an Omniverse Nucleus server, Pixar USD scene descriptions can synchronize scenes in real time between a metaverse such as Second Life and Omniverse. Other platforms that also have Omniverse Connectors can then communicate in real time for design, creation, and interaction from anywhere in the world.

https://developer.nvidia.com/nvidia-omniverse-platform

This allows people to collaborate on scenes. Someone using Unreal, for example, could see a Second Life region in real time: where mesh geometry, lights, and prims are placed, where avatars are moving, and how they are animating.

Likewise, someone using 3DS Max, Maya, SketchUp, or Blender could start making adjustments to that region, and everyone in the region would see the changes via the standard Second Life Viewer, as would the Unreal user. The Blender or Maya user could make adjustments, and every Omniverse Connector would receive those updates as well. If the Unreal user moves, interacts, or makes changes to the scene, this is also reflected in Blender, in Maya, and for users of the Second Life Viewer.
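
The scene data synchronized through USD can be sketched as a minimal `.usda` layer. This is an illustrative sketch only: the `/Region` root and `Prim_1` names are hypothetical, and a real Second Life region would carry far richer geometry, materials, and animation data. The `.usda` ASCII format is written here as plain text, so no USD libraries are required.

```python
# Minimal sketch of the kind of USD layer a connector might sync.
# All prim names here are hypothetical, chosen for illustration.
usda = """#usda 1.0
(
    defaultPrim = "Region"
)

def Xform "Region"
{
    def Cube "Prim_1"
    {
        double size = 0.5
        double3 xformOp:translate = (128, 128, 25)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
"""

# Write a snapshot that any USD-aware connector could open.
with open("region_snapshot.usda", "w") as f:
    f.write(usda)
```

Each connector would publish and subscribe to deltas against a layer like this through Nucleus, rather than exchanging whole files.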

Currently supported Omniverse connectors:

- Autodesk 3DS Max
- Autodesk Maya
- Autodesk Revit
- Epic Games Unreal Engine
- Graphisoft ArchiCAD
- Kitware ParaView
- McNeel Rhino/Grasshopper
- Reallusion Character Creator
- Trimble SketchUp

A permission system using Omniverse ACLs would need to be configured so that connectors honor whether they have read/write permission to the region they are interacting with, or are only observing, and whether they have permission to view it at all. Private parcels could restrict this access. Current inventory permissions could also be migrated to the ACL to restrict viewing, copying, and modification.

The access control list features are explained here: https://docs.omniverse.nvidia.com/prod_nucleus/prod_nucleus/usage/acls.html
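
The permission translation described above could be sketched as follows. This is a hypothetical mapping, not an existing API: the simplified `view`/`modify`/`owner` flags and the `read`/`write`/`admin` ACL entries are assumptions for illustration; both the real Second Life permission model and the Nucleus ACL system are richer.

```python
# Hypothetical translation of simplified Second Life permissions into a
# Nucleus-style ACL entry. Flag and key names are illustrative only.
def to_acl(sl_perms: dict) -> dict:
    """Translate a simplified SL permission set into an ACL entry."""
    acl = {"read": False, "write": False, "admin": False}
    if sl_perms.get("view"):      # may observe the region
        acl["read"] = True
    if sl_perms.get("modify"):    # may edit objects in the region
        acl["read"] = True
        acl["write"] = True
    if sl_perms.get("owner"):     # parcel/region owner gets full control
        acl = {"read": True, "write": True, "admin": True}
    return acl

# A visitor to a private parcel with no view permission gets no access:
print(to_acl({}))               # {'read': False, 'write': False, 'admin': False}
print(to_acl({"modify": True})) # {'read': True, 'write': True, 'admin': False}
```

A connector would then be granted or denied its Nucleus session based on the resulting entry.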

Permission to control a particular avatar for live interaction from another Omniverse connector (Unreal, for example) could be granted via ACL as well. The user would authenticate as the Second Life account that owns that avatar, with ownership recorded in a Pixar USD file listing all avatars in Second Life, served from Linden Lab's Nucleus server. Only the avatar controlled by that account could "log in" and interact with other users via another connector.

Single sign-on is supported: https://docs.omniverse.nvidia.com/prod_nucleus/prod_nucleus/installation/enterprise/sso.html
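
The avatar-control gate proposed above amounts to a simple ownership check after authentication. In this sketch the in-memory registry stands in for the proposed USD file of avatars served from Linden Lab's Nucleus server; all account and avatar names are hypothetical.

```python
# Hypothetical avatar-ownership registry, standing in for the proposed
# USD file of all Second Life avatars served from Nucleus.
AVATAR_OWNERS = {                 # avatar name -> owning SL account
    "Resident.Example": "account-123",
    "Builder.Demo": "account-456",
}

def may_control(authenticated_account: str, avatar: str) -> bool:
    """Grant live control only to the avatar's owning account."""
    return AVATAR_OWNERS.get(avatar) == authenticated_account

print(may_control("account-123", "Resident.Example"))  # True
print(may_control("account-123", "Builder.Demo"))      # False
```

Once SSO established who the connector user is, a check like this would decide which avatar (if any) they may drive.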

Omniverse has built-in support for WebRTC; audio and video calls and live streaming of media and web content are supported as well. WebRTC support would also allow Linden Lab to create a microservice for real-time control of avatars.
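
The avatar-control microservice mentioned above would need a small wire format for messages sent over a WebRTC data channel. The following is a hypothetical protocol sketch; the message type, field names, and units are assumptions for illustration, not an existing Second Life or Omniverse protocol.

```python
import json

# Hypothetical JSON wire format for real-time avatar control over a
# WebRTC data channel. All field names are illustrative assumptions.
def encode_move(avatar_id: str, position, heading: float) -> str:
    """Serialize one avatar movement update."""
    return json.dumps({
        "type": "avatar.move",
        "avatar": avatar_id,
        "position": list(position),  # region-local metres (x, y, z)
        "heading": heading,          # radians
    })

def decode(raw: str) -> dict:
    """Parse and validate an incoming control message."""
    msg = json.loads(raw)
    if msg.get("type") != "avatar.move" or len(msg.get("position", [])) != 3:
        raise ValueError("malformed avatar-control message")
    return msg

wire = encode_move("Resident.Example", (128.0, 64.0, 25.0), 1.57)
print(decode(wire)["position"])  # [128.0, 64.0, 25.0]
```

The microservice would validate each message against the ownership ACL before applying it to the simulator.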

User interface replication is also supported in Omniverse via OmniUI, so chat windows and inventory windows could be implemented on other connectors as well.

Linden Lab would need to host a public Omniverse Nucleus server, accessible by any Omniverse connector on the internet, where interaction with Second Life's extensive universe could take place.

https://docs.omniverse.nvidia.com/prod_nucleus/prod_nucleus/overview.html

Why is this feature important to you? How would it benefit the community?

An official Omniverse Connector for Second Life would enable real-time design and interaction with the other supported tools and platforms. This opens up a world of possibilities and would establish Second Life as a definitive metaverse target platform.

This is some next level Ready Player One action. Support for the Omniverse platform would modernize Second Life and fling it far into the future.

Unreal Engine could, for example, replace the need for major graphics updates to the main Second Life Viewer, and access to MetaHumans would become available. Avatars displayed on legacy viewers could fall back to an alternative body chosen by the user. Optional integration with webcam face-mesh tracking could drive the muscles of the MetaHuman face to smile, laugh, frown, wink, etc. in real time.

https://docs.unrealengine.com/4.27/en-US/Resources/Showcases/MetaHumans/

https://www.youtube.com/watch?v=ViHg7XP8O7Y

Access to tools currently unavailable to Second Life, such as Reallusion Character Creator, would become possible.

Ray tracing, and augmented and virtual reality headsets at superior frame rates and graphics quality, would immediately become supported by Second Life.

Region designers would gain the ability to collaborate in real time with advanced editing and scene design tools.

And creators would suddenly be able to use many popular art tools in real time to design products for the Second Life world.

Original Jira Fields

| Field | Value |
| ------------- | ------------- |
| Issue | BUG-231383 |
| Summary | Nvidia Omniverse Connector |
| Type | New Feature Request |
| Priority | Unset |
| Status | Closed |
| Resolution | Unactionable |
| Created at | 2021-11-02T19:11:41Z |
| Updated at | 2021-11-22T20:09:27Z |

```
{
    'Build Id': 'unset',
    'Business Unit': ['Platform'],
    'Date of First Response': '2021-11-10T13:04:19.522-0600',
    'ReOpened Count': 0.0,
    'Severity': 'Unset',
    'Target Viewer Version': 'viewer-development',
}
```

sl-service-account commented 2 years ago

Kyle Linden commented at 2021-11-10T19:04:20Z

Hello, and thank you for your feature request.

Incoming suggestions are reviewed in the order they are received by a team of Lindens with diverse areas of expertise. We consider a number of factors: Is this change possible? Will it increase lag? Will it break existing content? Is it likely that the number of residents using this feature will justify the time to develop it? This wiki page further describes the reasoning we use: http://wiki.secondlife.com/wiki/Feature_Requests

This particular suggestion, unfortunately, cannot be tackled at this time. However, we regularly review previously deferred suggestions when circumstances change or resources become available.

We are grateful for the time you took to submit this feature request. We hope that you are not discouraged from submitting others in the future. Many excellent ideas to improve Second Life come from you, our residents. We can’t do it alone.

Thank you for your continued commitment to Second Life.