In the first part of the series “The current state of the UX of VR”, I gave a general overview of the challenges of creating an intuitive UX in VR. Now, I want to focus on practical tips and examples that will help you get started with UX for VR. This blog post is a summary of my own experience, as well as of helpful articles and research by other UX designers experimenting in the same field.
How to position content in VR?
The first step when you start designing a UI screen in VR is to decide where to position the content. If you have tried VR before, you might have noticed that most screens are placed right in front of you. The space behind you is called the “curiosity zone”, because you would look in that direction only if some kind of trigger prompts you. Turning around, especially while sitting down, does not feel comfortable, so the best place to display interactive information is in front of the user.
I will use research by Alex Chu to further define the best area for positioning content. His team found that the comfortable range of head rotation is 77 degrees to the left and right, 66 degrees up, and 12 degrees down. To interact with elements placed outside this zone, the user has to strain their neck.
Also, any content closer than 0.5 m to the user’s eyes is hard to read and creates a cross-eyed feeling, while objects placed more than 20 meters away look flat. Thus, the best zone for showing interactive objects is between 0.5 and 20 meters from the user.
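To make these numbers concrete, here is a minimal sketch (my own helper, not code from any of the projects mentioned) that checks whether a placement, given as yaw/pitch angles and a distance relative to the user’s forward gaze, falls inside the comfortable content zone described above:

```typescript
// Hypothetical helper: test a placement against the comfortable zone
// (77° left/right, 66° up, 12° down, 0.5–20 m away).
interface Placement {
  yawDeg: number;   // positive = right, negative = left
  pitchDeg: number; // positive = up, negative = down
  distanceM: number;
}

function isComfortable(p: Placement): boolean {
  const yawOk = Math.abs(p.yawDeg) <= 77;                 // 77° left and right
  const pitchOk = p.pitchDeg <= 66 && p.pitchDeg >= -12;  // 66° up, 12° down
  const distOk = p.distanceM >= 0.5 && p.distanceM <= 20; // 0.5–20 m
  return yawOk && pitchOk && distOk;
}

console.log(isComfortable({ yawDeg: 30, pitchDeg: 10, distanceM: 2 }));  // true
console.log(isComfortable({ yawDeg: 0, pitchDeg: 0, distanceM: 0.3 })); // false: too close
```

A check like this can act as a design-time lint for a VR layout: any panel that fails it is a candidate for moving back into the comfortable zone.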
Large screens in VR
Once you have defined the comfortable content zone, start arranging the information you want to communicate. Most experiments with UX in VR show that large screens do not work well. Standing in front of a gigantic screen is overwhelming and makes you constantly turn your head left and right (especially when there are long lines of text). In VR you need to limit the content you are communicating and position it in narrow columns.
Another tactic that makes large screens more comfortable was discovered by the UX designers at Oculus. After dozens of iterations, they came up with the idea of using curved screens instead of flat ones. This way, all elements sit at an equal distance from the user, and the content is easier to read.
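The curved-screen idea boils down to placing every panel on a circular arc centered on the user. Here is a small sketch (an assumed helper, not Oculus code) that distributes a number of UI panels along a horizontal arc so each one sits at the same radius and faces the viewer:

```typescript
// Hypothetical layout helper: spread `count` panels over `arcDeg` degrees
// at a constant distance `radiusM` from the user at the origin.
interface Panel {
  x: number;      // left/right offset in meters
  z: number;      // forward offset (negative z = in front of the user)
  yawDeg: number; // rotation the panel needs in order to face the user
}

function layoutOnArc(count: number, radiusM: number, arcDeg: number): Panel[] {
  const panels: Panel[] = [];
  const step = count > 1 ? arcDeg / (count - 1) : 0;
  const start = -arcDeg / 2;
  for (let i = 0; i < count; i++) {
    const yawDeg = start + i * step;
    const rad = (yawDeg * Math.PI) / 180;
    panels.push({
      x: radiusM * Math.sin(rad),
      z: -radiusM * Math.cos(rad),
      yawDeg,
    });
  }
  return panels;
}

// Five panels on a 60° arc, 2 m away; the middle one sits straight ahead.
const arc = layoutOnArc(5, 2, 60);
```

Because every panel ends up exactly `radiusM` meters from the viewer, no part of the “screen” is closer or farther than the rest, which is precisely what makes the curved layout easier to read than a flat one.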
Keyboard in VR
Another dilemma you will need to resolve is how to position interactive elements such as keyboards and buttons so they are visible. The decision depends on the type of VR technology you are developing for. For example, if you are targeting Gear VR or HTC Vive, you could place the keyboard further away from the user, since they will interact with it via a laser pointer. However, if you are developing for Leap Motion, you will need to place the keyboard within arm’s reach. Watch Mike Alger’s video if you are going to use Leap Motion.
My experience comes from designing the keyboard for the LensVR WebVR browser for the Gear VR headset. At first, I made the keyboard large and placed it at the bottom of the browser screen, since I did not want it to cover the whole screen. However, when I tested it in VR, I discovered that it was impossible to type letters and see the URL bar at the same time. This made it easy to make mistakes, and the result became visible only after the user had typed the characters and looked up at the URL bar. In the second iteration, I made the keyboard a bit smaller and moved it closer to the user, so it was still clearly visible, and placed it right beneath the URL bar. Here is the result:
Where should you place buttons and menus?
In web design, UX designers usually align buttons to the left or to the right. In VR, however, this design pattern does not work well: content placed towards the corners of the screen falls outside the user’s focus area and looks blurry. Instead, put buttons, and any important information, in the middle of the screen. It will be sharp and clear, and the user will not need to turn their head to interact with the content.
Another thing to consider when designing for VR is whether an event should be triggered on press down or on release. I would recommend going with the press-down event. With the LensVR keyboard prototype, for example, I initially used the release event because I thought it would prevent mistyping: I imagined the user would have time to focus the pointer on the right letter before committing to it, instead of accidentally pressing other buttons while positioning the cursor. It turned out to be just the opposite. The event fired on release, at the very moment I was already moving the pointer towards the next letter, so I kept ending up with the wrong letter. The keyboard also felt slow and unresponsive.
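The failure mode above can be reproduced with a tiny event simulation. This is a hypothetical model (my own names, not the LensVR code): with a release trigger, the key that fires is whichever key the pointer happens to hover when the button comes up, which is often the next key the user has already started moving toward.

```typescript
// Hypothetical mini model of a gaze/pointer keyboard.
type PointerSample = { type: "down" | "move" | "up"; key: string };

// Replays a pointer stream and returns the letters that get typed,
// depending on whether the keyboard fires on "down" or on "up" (release).
function typedKeys(samples: PointerSample[], trigger: "down" | "up"): string[] {
  const typed: string[] = [];
  let pressed = false;
  let hovered = "";
  for (const s of samples) {
    hovered = s.key; // track which key the pointer is currently over
    if (s.type === "down") {
      pressed = true;
      if (trigger === "down") typed.push(hovered);
    } else if (s.type === "up" && pressed) {
      if (trigger === "up") typed.push(hovered);
      pressed = false;
    }
  }
  return typed;
}

// The user presses on "a", then starts drifting toward "b" before releasing:
const samples: PointerSample[] = [
  { type: "down", key: "a" },
  { type: "move", key: "b" }, // pointer already moving to the next letter
  { type: "up", key: "b" },
];
console.log(typedKeys(samples, "down")); // ["a"], the letter the user meant
console.log(typedKeys(samples, "up"));   // ["b"], the wrong letter
```

The press-down variant captures the key at the moment of intent, which is why it both types the right letter and feels more responsive.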
Tone down the colors
In web design, using vivid colors is a popular technique for signaling important information, but in VR, where the surroundings and all UI elements appear life-size, vivid colors feel bright and intrusive. I ran an experiment, testing several 360° photos with users. They consistently pointed to the images in pastel, toned-down colors as more appealing and more natural.
How the UX of VR is going to evolve is an exciting topic. Discovering even simple best practices, like the ones mentioned in this blog post, requires thorough testing. In my next blog post about UX in VR, I will share the results of a UX experiment we have done. Sign up for the newsletter and never miss a blog post!
Author: Billy Vacheva, @BilyanaVacheva