Dialexa executives Brad Bush (Chief Operating Officer) and Steve Ray (Chief Creative Officer) recently sat down together to consider how user experience and interaction will evolve in the worlds of VR and AR. Here’s the conversation that took place.
Brad: This transition to VR/AR is similar to how we moved from doing things manually to doing things with computers. We started with tools that were representative of real-world objects—like using a floppy disk as the save icon.
Younger generations don’t even recognize what a floppy disk is anymore, so we’ve started to move away from these real-world images. But do you think user experience in VR will trend back toward realistic representations?
Steve: That’s a great question, because if you think about how computers were introduced to the average home, it was a big shift to imagine manual processes going digital. It was the visual interface and representations of real-world objects that introduced familiarity with functions and tasks like file management, organization, and dragging something to a trashcan.
It’s this idea of skeuomorphism—making the digital user experience take the shape of the physical world. As we move into virtual/augmented reality, part of the shift back to representing physical objects will be removing interfaces altogether. It will be a while until we see this level of comfort, but the time will come when users are comfortable without VR interfaces, because the real world has no interfaces; it really IS the interface.
Brad: You want that kind of seamless representation of the real world, but you still want to keep some of your digital superpowers, right? Multitasking, powering through an Excel spreadsheet, things like that. There are just some things we can’t do in the real world.
And with that said, there are two things I’ve noticed about interfaces and user experience in VR gaming. One, the placement of screens and messages is a work in progress: how do you make sure people know where to look for a welcome screen or a new message? We’ve even seen messaging and interfaces in the “dead zone” you only notice when you look down. And two, how can you indicate that an object is interactive versus something that isn’t?
Steve: Those questions just bring us to the fact that we’re still creating and defining new interaction patterns for virtual/augmented reality. These are the kinds of decisions we have to make when thinking about designing a great AR/VR experience—do we want a message placed in front of users and following them? Do we want things to be directional, where the users have to turn to their left or right to see some sort of menu or navigation element? We’re still in the wild west of VR experience design; the laws and conventions are still being written.
To validate our decisions, we’ll go back to the rapid prototyping and research that are fundamental to our design process. The process doesn’t change, but with VR/AR we’ll have an easier time watching how people interact with environments and applications. We’ll start to learn whether hand gestures or voice input are more natural, or whether interaction with menus is even necessary.
Brad: Let’s unpack those interactions. There are really two basic patterns—floating, interactive items that are connected to users vs. world-connected interactive objects. The problem is that depending on the use case, either option could be best.
You might want to throw something in the trash bin in VR and have that represent the real-world trash bin. In that case, the VR trash bin can’t move. But in the case of screens and menus, it might make sense to have those floating and following users as necessary.
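The two anchoring patterns Brad describes can be sketched in a few lines. This is an illustrative Python sketch, not code from any real VR engine or SDK; the names `WorldLocked` and `HeadLocked` are hypothetical, and for simplicity it tracks only head position, ignoring head rotation.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

def add(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

@dataclass
class WorldLocked:
    """A world-connected object (e.g. the trash bin): its position never changes."""
    position: Vec3

    def update(self, head_position: Vec3) -> Vec3:
        # Ignores the head entirely; the object stays put in the world.
        return self.position

@dataclass
class HeadLocked:
    """A user-connected object (e.g. a floating menu): it follows the head at a fixed offset."""
    offset: Vec3

    def update(self, head_position: Vec3) -> Vec3:
        # Recomputed every frame, so the object trails the user around.
        return add(head_position, self.offset)

# The trash bin stays at (2, 0, 0) no matter where the user walks;
# the menu floats half a meter in front of wherever the head is.
trash_bin = WorldLocked(position=(2.0, 0.0, 0.0))
menu = HeadLocked(offset=(0.0, 0.0, -0.5))

head = (1.0, 1.6, 0.0)
print(trash_bin.update(head))  # (2.0, 0.0, 0.0)
print(menu.update(head))       # (1.0, 1.6, -0.5)
```

The point of the sketch is that the choice isn't in the object itself but in its update rule, which is why either option can be right depending on the use case.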
And then there are the ways users will come to interact with VR/AR. Voice and gestures are really the two options, but today’s technology mostly relies on pseudo-gestures, where you look at a menu and select options with a controller. That’s how the Oculus Rift works now, while the HTC Vive supports fuller gestures, where you can reach out and select an option with a trigger.
But tomorrow’s interactions will start to incorporate eye tracking so you can actually blink to select menu options. This will leave us with pure eye control, eye/controller control, full gesture, and voice as our options. What do you think about those?
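The pseudo-gesture model Brad mentions (gaze to highlight, trigger or blink to confirm) boils down to a ray test from the head. Here is a minimal Python sketch under simplified assumptions: menu items are modeled as spheres, and the function names (`gaze_hit`, `select_item`) are hypothetical, not from any real SDK.

```python
import math

def gaze_hit(head, gaze_dir, center, radius):
    """Return True if the gaze ray from `head` along `gaze_dir` passes
    within `radius` of `center` (a simple ray/sphere test)."""
    # Vector from the head to the target's center.
    to_center = [c - h for c, h in zip(center, head)]
    # Normalize the gaze direction.
    norm = math.sqrt(sum(d * d for d in gaze_dir))
    dir_n = [d / norm for d in gaze_dir]
    # Distance along the ray to the point nearest the center.
    t = sum(v * d for v, d in zip(to_center, dir_n))
    if t < 0:
        return False  # target is behind the user
    closest = [h + t * d for h, d in zip(head, dir_n)]
    dist = math.sqrt(sum((c - p) ** 2 for c, p in zip(center, closest)))
    return dist <= radius

def select_item(head, gaze_dir, items, confirmed):
    """Gaze highlights; a trigger pull (or, tomorrow, a blink) confirms."""
    if not confirmed:
        return None
    for name, center, radius in items:
        if gaze_hit(head, gaze_dir, center, radius):
            return name
    return None

menu = [("Play", (0.0, 0.0, -2.0), 0.3), ("Quit", (1.0, 0.0, -2.0), 0.3)]
# Looking straight ahead and pulling the trigger selects "Play".
print(select_item((0, 0, 0), (0, 0, -1), menu, confirmed=True))  # Play
```

Swapping the confirmation signal (trigger, blink, dwell time, voice command) changes only the `confirmed` input, which is one reason the input modalities Brad lists can share the same underlying selection logic.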
Steve: Well, like you said, the conversation right now is between pure gesture and voice. The goal is to be as human as possible. We started thinking about this a few years ago in the context of mobile apps: what would happen if we eliminated navigation altogether? Could people still figure out how to use the app?
With virtual/augmented reality, we have a better opportunity to make things more natural. We need a new term other than skeuomorphism—we want to make the world so real that it disappears. Whether that happens through pure gestures, voice, eye tracking or something else remains to be seen, but we’ll probably see a combination of everything depending on the use case.
Brad: I think you’re right that we need to adapt to the needs of users as we learn more about what people prefer. But along those same lines, what if our 3D environments could start representing a more convenient version of the real world? What do you think of this idea of setting up various environments within virtual reality and actually saving them for future use?
Steve: I think that’s going to become increasingly important. Whether you’re aiming for leisure, productivity, or something else, being able to buy or create a virtual environment and revisit it whenever you want is great. Just think about the classic scene where someone is lying on a beach reading a book—you could put yourself there whenever you wanted.
This really just fits into the ongoing trend toward software development kits and APIs that limit the amount of programming we actually have to do. We still have work to do to get to a point where anyone can create new objects regardless of their technical skill, but eventually we’ll have libraries of VR objects to build from.
Brad: So if we’re talking about VR/AR in terms of the point when we’ve reached more universal adoption, let’s change gears a little for this last topic. What do you think about social interactions inside VR and how that user experience will look?
Steve: It’s easy to think of this idea now and feel a little strange—creating the most realistic representation of real-world social interactions will take some getting used to. But there’s a lot of value there.
Whether it’s for business or a more personal interaction, there are just some meaningful conversations we want to have face-to-face. With innovation around eye and gesture tracking, we’ll be able to have these conversations in VR.
From a personal perspective, I remember a time when my mom came to visit me and I took her to a coffee shop. I can still recall the jazz music playing in the background and the nice setting—having this kind of interaction in VR will be huge.
Brad: That’s a great point. But I feel like if there’s one thing to take away from this whole conversation, it’s that we still have a long way to go in terms of technology maturity and just people getting used to VR/AR in general. But hardware and software are making great strides and maybe in the next 2 or 3 years VR/AR will start to gain some serious, mainstream traction.
Regardless, though, it’s been great talking to you about how user experience will change as virtual/augmented reality takes shape. Thanks, Steve!
Steve: It’s definitely an exciting time to be working in technology and design. Thanks, Brad!
Getting Out Ahead of the Virtual/Augmented Reality Curve
There’s no doubt that we’re on the edge of massive growth in virtual/augmented reality. If you want to get (and stay) ahead of your competition, you have to start thinking about how VR/AR will fit into your business and start experimenting now.
Still aren’t sure how VR/AR apply to your business? Download our free ebook, 11 Tasty VR/AR Recipes, for a deeper look at some of the top business use cases for the technology.