Using VR Tools To Design VR Games

Epic’s Lauren Ridge plays with the Unreal Engine VR Editor
 
Virtual reality games are enjoying a boom at the moment. Many developers, both AAA and indie, are exploring the possibilities of this exciting medium. But by and large, their immersive 3D worlds are being built with the same 2D tools that were used to create video games for flat monitors, TVs, and touchscreens.
 
Will devs one day create 3D worlds within a 3D workspace? Some game companies are currently trying to make that a reality. At the forefront are Unity and Epic, who both offer VR authoring tools that work with their existing game engines. The Unreal Engine VR Editor and Unity EditorVR are both currently available for users of those engines.
 
Why use VR design tools?
 
Unity Labs principal designer Timoni West believes using virtual reality to create video games is the natural next step, one that will be both more intuitive and more accurate. It's very difficult to describe 3D objects in 2D, she said, which is why developers rely on modeling software like Blender or Maya. But in VR, devs can see 3D objects in their natural environment and manipulate them directly.
 
“Let’s say I want to draw a Christmas scene in VR,” West explained. “Right now, I have to set up the scene in Unity in 2D, use the grids to align things, constantly move the screen back and forth using the hot keys to pan, or other keys, to place things as precisely as the mouse and keyboard. And people get really good at this. They get really fast at it, but it takes years and years of practice and knowing how to translate what you’re trying to do in two dimensions across three dimensions. But if you just put on the headset and go to VR, you can decorate your Christmas tree just like you would in real life, and that’s a big advantage.”
 
Lauren Ridge is a technical writer for Epic Games who works on the Unreal Engine’s VR Editor. Over the past year, she’s collaborated with the team to add new features and improve user experience. Like West, she believes that seeing and manipulating objects in 3D can give game developers an edge when crafting virtual reality experiences.
 
“If you’re making a VR experience or a VR game, sometimes it’s hard to really feel if [objects] are right until you’re actually in the position of the VR player,” said Ridge. “So, maybe something is too close to you, maybe something’s not quite the right height for your reach, and being actually immersed in that environment lets you tell that without having to iterate a lot between the desktop editor and playing the game in VR.”
 
Another potential benefit of VR editors is that they can make a developer’s work a little more fun. Although Ridge says her team isn’t trying to “gamify” game-making, she believes people will enjoy seeing their game worlds build up around them. “It really shifts your perspective from just looking into a window at it,” she said. “We have inertia on objects, so if you’re moving, say, a ball around the world, you can throw it with a gizmo and it actually has a little bit of physics to it. And things like that really give it more of an immersive, engaging feeling while you’re designing that way.”
 
What does game design with VR tools look like?
 
During the keynote address at Unite 2016 in Los Angeles, West took to the stage to show off how EditorVR works. (You can see it at the 1:48:00 point in this video.) Using Campo Santo’s first-person adventure Firewatch as an example, she used a Vive headset and controllers to easily and fluidly place objects, such as a coffee mug and typewriter, into the scene and change their positions and rotations.


Fiddling around with Firewatch’s level design in Unity’s EditorVR
 
“We’re just trying to put as much of the functionality that already exists in Unity directly into VR so you can just do it all in there, and you don’t have to get out of your workflow or take off your headset,” says West.
 
Meanwhile, Epic co-founder Tim Sweeney has himself demoed in a video how his company’s VR editor can be used.


 

Source: Gamasutra
