I first saw the video above in May 2015, when The Void released it and virtual reality enthusiasts gave it the spotlight. The ideas behind it reflect the future of virtual reality entertainment. Their goal is to make you feel like you are inside the virtual reality world. To accomplish this, they’ve integrated small elements into the landscape — heat, wind, water, texture, and movement — to make it feel like you are experiencing what you see in the virtual world.
Virtual reality and augmented reality technologies are becoming more accessible and gaining popularity, with companies investing in the space through products like the Oculus Rift, Microsoft HoloLens, HTC Vive, and Google Cardboard. While most companies are targeting home usage, it’s hard to feel truly immersed in a virtual world while sitting on your sofa or standing in your living room. An interactive environment with haptic feedback helps you truly forget where you are.
I just hope that the experiences of all Virtual Reality headsets live up to our imaginations.
I have a friend who is really into virtual reality development, and he introduced me to the Google Cardboard Kit. Google Cardboard makes VR accessible by using the smartphone as the basis for the platform. Since most people have smartphones but very few have VR headsets, a simple and cheap conversion add-on makes sense. I figured that with its low barrier to entry, I could at least test it out, and as soon as my kit arrived, I started doing a bit of development for my iPhone 5 in Unity.
I got this kit for $26, and unlike the other cardboard ones, this one was made of plastic and had better construction, with adjustable focus and lens separation.
Here are some of the thoughts I had while developing for and trying out the Google Cardboard SDK for Unity:
It’s awesome for game developers because it provides them with an easy way to take a 3D game and turn it into a VR experience! (as long as it’s been developed in Unity)
BUT there’s a very limited way to interact with applications right now (aside from moving your head). The only real interaction is the “trigger,” which is a magnet on the side of the headset. I’ve seen videos of people hooking up controllers to their Android phones, but it would be amazing if there were better guides on how to set up a wireless controller, Leap Motion, or Myo armband for interactions from a developer standpoint.
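For anyone curious how that magnet trigger shows up in code, here’s a minimal sketch. It assumes the 2015-era Cardboard SDK for Unity, where the `Cardboard.SDK.Triggered` property reports the magnet pull for a single frame; the class and log message are my own illustration, not part of any official sample.

```csharp
using UnityEngine;

// Sketch: react to the Cardboard magnet "trigger" each frame.
// Assumes the 2015-era Cardboard SDK for Unity, where Cardboard.SDK.Triggered
// is true only for the frame in which a magnet pull was detected.
public class TriggerHandler : MonoBehaviour
{
    void Update()
    {
        if (Cardboard.SDK.Triggered)
        {
            // React here, e.g. select whatever the user is gazing at.
            Debug.Log("Magnet trigger pulled");
        }
    }
}
```

That single boolean is essentially the whole input surface, which is why richer controllers feel so necessary.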
The Unity SDK really limits the functionality of applications right now. For example, it’s almost impossible to figure out how to stream video to it from a server, or to open a simple socket connection to a server in order to send input or data. (I am trying to use the headset to move a servo that is connected to a webcam for remote viewing.)
Overall, I learned a lot from a quick dive into developing a Unity application with the Google Cardboard SDK, but I’m not sure that it’s ready yet for me to go much further. The biggest issues for me are the limited interaction and the lack of a usable cross-platform networking stack. I’m sure that it will get better, but for now I’ll just have to be happy playing a couple of simple games and watching YouTube 360 videos.
If you want to see what I did as an entry point:
I used this “Roll-a-Ball” tutorial to get accustomed to developing in the Unity environment before I built this VRCameraDemo, which takes my phone’s rear-facing camera feed and places it on a plane in front of the viewer to recreate what it sees. (Not very “virtual” reality)
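The core of that camera-on-a-plane trick fits in a few lines using Unity’s standard `WebCamTexture` API. This is a sketch of the approach rather than the exact script from my demo; it assumes the component is attached to the plane whose material should show the feed.

```csharp
using UnityEngine;

// Sketch: grab the device's rear-facing camera feed and paint it onto
// the plane this script is attached to. WebCamTexture is standard Unity.
public class CameraFeed : MonoBehaviour
{
    void Start()
    {
        // Pick the first back-facing camera if one exists.
        string deviceName = null;
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (!device.isFrontFacing)
            {
                deviceName = device.name;
                break;
            }
        }

        WebCamTexture feed = new WebCamTexture(deviceName);
        GetComponent<Renderer>().material.mainTexture = feed;
        feed.Play();
    }
}
```

Put the plane a fixed distance in front of the VR camera rig and you get a crude pass-through view of the real world.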