omigroup / ux-research

This is the new repo for the recently formed UX Research group in OMI. Check back weekly for updates on this repo and how to get involved.

What are your ideas for problems worth solving? #12

Open mrmetaverse opened 1 year ago

mrmetaverse commented 1 year ago
  Lyuma - Where should we post use cases?

Originally posted by @mrmetaverse in https://github.com/omigroup/ux-research/discussions/11#discussioncomment-3908026

mrmetaverse commented 1 year ago

My favorite starting idea comes from one of Shadow's questions: "do portals work yet, and can I have a real metaverse field trip with my friends?" I think we could explore various ways this could work, and probably get some platforms involved too :) I also created an interim Milestone for this concept. If we want to explore it, we should try building a milestone around it rather than just filling a backlog.

https://github.com/omigroup/ux-research/issues/8

LightLodges commented 1 year ago

Cross-generational use case around all ages attempting to attend events together across devices (PC, VR, Mobile) -- how can we make it as easy as possible for a wide range of people to participate in shared events in virtual spaces?

We have tested this in Hubs, especially for weddings and funerals where the participants span all ages and are generally not web-native, and found that many were unable to sort out 3D engagement and preferred watching on a 2D device or flat screen when possible. Is there a challenge to overcome for multigenerational events around ease of use across a wider range of devices? What are the best practices for designing public experiences for all ages and backgrounds?

LightLodges commented 1 year ago

UX research around WebXR media channels -- how do people want to find, browse, discover, and move to various content experiences made across the open metaverse?

LightLodges commented 1 year ago

UX of Portals -- my partner said he sees 3 specific types of portals across game/experience worlds:

  1. Stargate or Star Trek style dialing in of "Teleportation"
  2. Magic elevator/door/vehicle
  3. Walking through or clicking through a PORTAL like in the Portal game

Other trends we've noticed: portals are often blue like links, with some sort of glow to indicate an interaction, or labeled so there's an easy way to designate that a portal is a multidimensional link. What are the best portal examples out there, and what makes the portal experience easiest for users of all ages to engage together?

Altspace and a few other tools allow for group portals/teleportation: is this a growing interest/opportunity?

Elirudite commented 1 year ago

I have quite a few use cases that I think would be very beneficial to consider. Note that I am not limiting this to merely UI patterns, but real-world problems; I think the very act of attempting to solve them will uncover UX principles and tools that are far more inclusive and powerful. Please consider the following more as areas to research, rather than as problems to immediately build products around:

Belonging - How can people create their own virtual worlds that allow them to feel like they 'own' something real on the internet, rather than merely renting from big companies serving as virtual landlords? Ex: How can a kid create their own 'room' online where they control its design and access? What if I want a space where I can put my notes, favorite websites, virtual objects, photos, etc.? How can I build that? How can I make sure it's 'mine' and not under the ToS of some company? Ex: How can a community of people build out their own virtual world so they can see each other, share what's going on in the neighborhood, and even facilitate advanced things like food access and transportation services?

Traversal - How do people find, discover, and move between virtual worlds? How can we build a mapping/navigation tool that is more accessible than Google, not completely based on algorithmic recommendation engines, and allows people to discover things that are fulfilling, rather than tied to some profit motive?

Interoperability - How do people move 'data' like their friends list, items, avatars and creations between virtual worlds? What if they want to showcase these items in the physical world (via a display, 3D printed, etc)?

Skills training - How do we make it easy for people to build, find, and/or use training simulations that give them real-world skills? Ex: I want to learn how to change my oil or do more advanced car repairs so that I can be a technician. How can I find a simulation that will teach me and let me practice building this skill? Ex: I am a mother who wants to get my kids more excited about school (or at least learning). What can I do to get them to care about history, math, science, etc.?

Advanced community building - How can we bring together various technologies to empower people to solve their own hard problems? This may mean combining XR with things like IoT and blockchain, while also making it super accessible and easy to use. Ex: We are a community suffering from food insecurity and housing instability; how can we use advanced tech to build our own self-sustainable neighborhood, where we can grow our own food, build our own houses, and manage our own micro-economy? (This may involve bringing together big data tools to gather information about the geographic area and demographics, visualizing that data in an easy-to-understand way, finding people who can give insights on how to use that data, using those insights to build ideas, simulating those ideas with the data to predict possible outcomes, and implementing the solutions in the physical world.)

Health - How can we make it easy for people to understand what impacts their health? Visualizing everything from genetics, to the immune system, to diet and exercise, to environmental impacts, and so on. Ex: Why should I trust this vaccine? - Show a simulation that visualizes exactly what the vaccine does, and even shows who came up with the vaccine and how. Building trust. Ex: How do I eat healthier? Nothing seems to work for me. - Visualize how different foods impact their specific body, and simulate what would happen if they ate certain foods at a certain frequency. Ex: What impact does climate change really have? - Show predictions of how their neighborhood will be impacted by flash floods, fires, famine, etc.; show the biggest polluters in their area. Ex: What's wrong with me, why don't I feel normal? - Show diagnoses for different mental health conditions, help people find the help they need, and/or show people what it's like living as a neurodivergent person.

Once again, for all of these use cases, we wouldn't necessarily be building the exact tools ourselves (at least not yet), but instead doing research into the problem space to really understand the issues and see how (or if) XR can help solve those problems. Then we would share our learnings for others (which may include our own working groups) to use in solving those problems. Some things we might learn: the scope of the problem, how to build XR tools for that space in a way that is accessible and impactful, the specific accessibility considerations to keep in mind and how to begin addressing them, and so on.

Part of this will likely lead to talking to people in these areas, sharing their insights and issues, prototyping possible solutions, and creating some kind of easily viewable/editable living standards wiki for all of our research.

lyuma commented 1 year ago

Here's what I am most interested in. First and foremost, it's important to remember that the users of our systems are humans. Users are people, just like us, and they are all unique, with different wants and needs. As a philosophy, I like to consider that the needs of the people who use a system outweigh all else in terms of the design and architecture.

( For an example of this philosophy, take a pretty UI with a green background. If a user finds that UI ugly, or the green makes it difficult to read, I would prefer that the UI be changed to black on white and default fonts rather than have that user be unhappy using the system. But this doesn't have to be black-and-white: there are other ways to handle it, such as allowing style overrides, permitting modding, or providing settings. These approaches keep the default as originally designed, while also allowing users who do not like it to make it meet their needs. )

Accessibility - a wide range of devices may be used to access these XR experiences, with a wide range of personal needs. Examples of accessibility work could be 3D labels, navigation (a sort of 3D tab order), and equivalents to alt text.

-- In terms of devices, it could be anything from a conventional 2D user interface (a window or a web browser), to a 3DOF headset with no controller, to a 6DOF VR headset with two controllers, to a 6DOF AR headset with only hand tracking, and more.
-- In terms of people, they could be blind; or deaf; or stuck in bed (needing Horizon Adjust); or unable to bend down or limited in mobility; or have any other condition; and any combination of these. There are incredible opportunities for 6DOF tracked applications to be accessible to people, possibly with some combination of haptics and other auditory or visual signals.

User interface elements - While I see OMI as helping bring worlds together by defining means for exposing resource locators (some sort of URL) between metaverse implementations, I do not know if OMI can or should define the standard portal appearance. I would expect different platforms to experiment with different portal metaphors. I see nothing wrong with coming up with references for how portals could look, and having a reference for user interface elements that meets all the accessibility requirements would be good.

As an example of a UI concept which could see convergence, multiple researchers have brought up the concept of an "oriel" in XR applications, which is sort of an analogy of a 3D window. Dofdev describes the concept here: https://dofdev.org/dofs/oriel/ and https://twitter.com/opendegree/status/1535387864908943361 (It's worth reviewing the work dofdev.org is doing on the UI front: https://dofdev.org/ )

I could also see future work on intents (a sort of URL with bundled information which can be handled by various metaverse platforms; see Android), ways to drag-and-drop data between metaverse applications... (eventually, having multiple metaverse applications open at the same time may be important to building the open metaverse).
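To make the intent idea concrete, here is a minimal sketch of parsing such a URL into an action plus bundled data. Note that the `xr-intent` scheme, the `join` action, and the parameter names are all invented for this illustration; nothing here is part of any OMI proposal or Android's actual intent format.

```python
# Hypothetical sketch: parsing an intent-style URL into an action and
# its bundled data. Scheme and field names are invented for illustration.
from urllib.parse import urlsplit, parse_qs

def parse_intent(uri: str) -> dict:
    """Split a hypothetical intent URI into an action plus bundled data."""
    parts = urlsplit(uri)
    if parts.scheme != "xr-intent":  # invented scheme, not a standard
        raise ValueError(f"not an intent URI: {uri}")
    # parse_qs returns lists; keep the first value of each parameter
    data = {key: values[0] for key, values in parse_qs(parts.query).items()}
    return {"action": parts.netloc, "data": data}

# A platform receiving this could hand the "join" action and the target
# world URL to whatever component knows how to open a world.
intent = parse_intent(
    "xr-intent://join?world=https%3A%2F%2Fexample.com%2Fpark&avatar=vrm-42"
)
# intent["action"] == "join"; intent["data"]["world"] is the decoded world URL
```

The interesting design question is less the parsing and more the negotiation: which platform handles which actions, and what guarantees the user gets about where the bundled data goes.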

Also imagine some sort of drag-and-drop interactions between metaverse platforms. For example, taking a piece of content from one into the other, perhaps through an oriel. While such a vision is a long way off, we may have instances of people wanting to drag content from their computer into their metaverse application. See StardustXR for examples of an XR-native display server which may have these sorts of interaction mechanics.

Finally, @Elirudite raises some interesting questions in the Belonging and Interoperability sections. To me, one of the more interesting parts of this problem is how to explain to people where their data is. A relevant part of the open metaverse puzzle could be being able to explain to people where they are, who hosts their inventory/data, and who has access to their realtime VoIP, data, and IK data... much like how the URL bar of a browser (often shortened in modern browsers to just the domain name) allows users to know which website they are on, and by extension, who can see their data.

lyuma commented 1 year ago

I'll leave a comment with what sindhu wrote in discord, so it is accessible to people not on discord.

Consumer-based use cases

- Login/Authenticate
- Check calendar
- Check email/messages
- Attend meeting
- Write notes
- Access documents
- Make voice calls
- Make calendar appointments
- Create documents
- Check bank balance
- Pay someone
- Review financial transactions

Teens

- Meet with friends
- Send messages to friends
- Play games w/friends
- Collaborate on school projects
- Check school notifications
- Check work notifications
- Attend Zoom classes
- Access school documents

Elirudite commented 1 year ago

We organized these use cases into shared themes (and discussed each use case further) in today's meeting, here: https://www.figma.com/file/L36As6RxgQ0VxfdOuLprgC/OMI-UXResearch_November112022?node-id=0%3A1

Elirudite commented 1 year ago

Please vote on your ideal use case in Figma here: https://www.figma.com/file/L36As6RxgQ0VxfdOuLprgC/OMI-UXResearch_November112022?node-id=0%3A1