Mixed reality is one of the most impressive uses of virtual reality technology we’ve seen so far. Using a green screen and an inventive camera setup, creators can composite a player into the game world on screen, so it surrounds them as if they were actually standing inside the VR experience itself. This not only looks great on video, it also does a wonderful job of communicating what it’s like inside the headset.
Back in October, Owlchemy Labs, the developer behind the critically acclaimed and commercially successful Job Simulator and the upcoming Rick and Morty VR game, announced plans to iterate on and improve mixed reality capture technology. Today the studio announced the results of that work, which include dynamic lighting, automatic green screen bounding, and transparency features.
UploadVR reached out to Owlchemy Labs CEO Alex Schwartz to ask about these new features and how they will work with non-Owlchemy games and applications. “It can be dropped into any app with zero integration and will work for any 3D content in Unity. At this moment we’re looking for people to try it out in a closed beta and the licensing model is yet to be announced.”
Ceilings, walls, clamps, and lights always seem to enter frame at the worst time — we have a solution!
* Shooting mixed reality footage generally results in shots where mounts, lights, and non-green-screened areas come into the shot, ruining the footage. Our tech uses a new smart green screen bounding technique to reduce the required green screen coverage so you don’t need to worry about pointing the camera away from the green screen!
* This enables new types of shots to be pulled off without worrying about panning out of the green screen area, such as wide pans or 360 turnarounds. Now the camera can be pointed in all directions without needing a green screen in view!
* This also means that very small green screen setups can be utilized much more effectively! Supporting lower-end setups will help immensely with wider adoption of mixed reality, so that bedroom YouTubers and Twitch streamers will be able to produce quality content at very low cost without high-end screens or professional lighting rigs.
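Owlchemy hasn’t published implementation details, but the core idea of bounded keying can be sketched in a few lines. In this minimal NumPy sketch (all function and parameter names are hypothetical, not Owlchemy’s actual tech), pixels outside a known green-screen bounding mask are replaced with the rendered game frame directly, so mounts, lights, and walls outside the screen never leak into the shot, while pixels inside the bound are chroma-keyed as usual:

```python
import numpy as np

def composite_bounded(camera, game, green_bound):
    """Chroma-key composite restricted to a green-screen bounding mask.

    camera:      H x W x 3 float camera frame (0..1)
    game:        H x W x 3 float rendered game frame (0..1)
    green_bound: H x W bool mask, True where the physical green screen
                 is known to cover the camera frame

    Outside the bound we show the game render directly; inside the
    bound we key out green pixels and keep the rest (the player).
    """
    r, g, b = camera[..., 0], camera[..., 1], camera[..., 2]
    # naive chroma key: a pixel counts as "green" if G clearly dominates
    is_green = (g > 0.5) & (g > r * 1.5) & (g > b * 1.5)

    # keep the camera pixel only if it's inside the bound AND not green
    keep_player = green_bound & ~is_green
    return np.where(keep_player[..., None], camera, game)

# tiny 1x3 example: green screen pixel, player pixel, off-screen wall pixel
camera = np.array([[[0.0, 1.0, 0.0],    # green screen
                    [1.0, 0.0, 0.0],    # player (red shirt)
                    [0.2, 0.2, 0.2]]])  # wall outside the green screen
game = np.full((1, 3, 3), 0.5)          # flat gray game render
bound = np.array([[True, True, False]]) # wall pixel lies outside the bound
out = composite_bounded(camera, game, bound)
```

The wall pixel would normally ruin the shot in a traditional setup; here it falls outside the bound and is simply replaced by the game render.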
And finally, the tech adds proper transparency support.
“This is the first time transparencies have been able to be drawn and sorted at proper depth in mixed reality,” Owlchemy writes in its blog post. “It finally just works how it should. The limitations of prior implementations of mixed reality have prevented transparencies from functioning, but now that we’re in-engine, we can tackle this issue. Anyone who has tried this before will tell you it is extremely complex to pull this off, let alone in real time. With proper alpha blending, we can also achieve amazing effects with particles, such as having someone stand behind a wall of smoke, in the rain, snow, or any complicated transparent scene.”
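The effect Owlchemy describes, a player standing behind smoke, rests on sorting transparent layers by depth and blending them back-to-front with the standard “over” operator. Here is a minimal per-pixel sketch of that idea (hypothetical names; not Owlchemy’s code, which runs in-engine on the GPU):

```python
import numpy as np

def over(src_rgb, src_a, dst_rgb):
    """Porter-Duff 'over': blend a transparent layer onto what's behind it."""
    return src_rgb * src_a + dst_rgb * (1.0 - src_a)

def composite_scene(layers, background):
    """Composite transparent layers back-to-front for one pixel.

    layers: list of (rgb, alpha, depth) tuples. Sorting by depth is
    what lets a player stand *behind* smoke while still appearing in
    front of the far wall.
    """
    out = np.asarray(background, dtype=float)
    for rgb, alpha, _ in sorted(layers, key=lambda l: -l[2]):  # farthest first
        out = over(np.asarray(rgb, dtype=float), alpha, out)
    return out

# a white player at depth 5, half-transparent gray smoke nearer at depth 2
layers = [([1.0, 1.0, 1.0], 1.0, 5.0),
          ([0.5, 0.5, 0.5], 0.5, 2.0)]
result = composite_scene(layers, np.zeros(3))
```

Because the smoke is nearer, it is blended last and correctly dims the player behind it; drawing in the wrong order would instead hide the smoke entirely.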
Source: VentureBeat