Building an iPad VR Headset

Since hearing about the Oculus Rift Kickstarter I’ve been interested in having one of my own. From the description, and from the posts I’ve seen from others building their own, it looked like something pretty easy to do.

So I built one, since my iPad already contained most of the difficult pieces. My goal was to build it without acquiring anything from eBay or other sellers I didn’t trust.

Here’s how mine was made.

Optics and Prototyping Without Code

I started by searching for lenses. The best I could find were 2″-diameter 4x loupe lenses, which sell for $5 each at Fry’s Electronics. A single lens didn’t have enough magnification: positioned far enough from the iPad to be in focus, I could see beyond the edges of the screen. I ended up stacking two of these lenses on each side. The lenses were mounted in foam core cut to the outline of the lens (using the plastic ring each lens shipped in as a guide), then taped together with Gorilla Tape and masking tape. I chose the lens-to-iPad distance by eye so the screen appeared in focus; I didn’t do any real calculations to build this correctly.
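
I never did the math while building it, but the standard loupe rule of thumb roughly predicts what I saw. A back-of-envelope sketch (the formulas are textbook optics; none of these numbers come from a datasheet): an “Nx” loupe is usually rated as 250 mm divided by its focal length, and stacking two thin lenses in contact approximately adds their optical powers.

```cpp
// Back-of-envelope loupe math, not measurements: "4x" implies
// f = 250 mm / 4, and stacked thin lenses add powers (1/f).
#include <cstdio>

int main()
{
    const double nearPoint = 250.0;            // standard near point, mm
    const double fSingle   = nearPoint / 4.0;  // one 4x loupe: ~62.5 mm

    // Two identical thin lenses in contact: 1/f = 1/f1 + 1/f2.
    const double fStacked = 1.0 / (1.0 / fSingle + 1.0 / fSingle);

    std::printf("single lens: f = %.1f mm (4x)\n", fSingle);
    std::printf("two stacked: f = %.1f mm (~%.0fx)\n",
                fStacked, nearPoint / fStacked);
    return 0;
}
```

Halving the focal length also halves the comfortable screen distance, which matches why the stacked pair let the iPad fill more of the view.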

First iteration of the lenses, used to find the focus distance. The lenses are taped to the packaging they came in.

Once I had lenses to play with, I worked on getting an image or video on screen, starting with the Crytek Sponza and a stereoscopic camera script in Blender. I even did the lens distortion using the Lens Distortion node in the Blender compositor.

First iteration of the 3D view, prototyped in Blender and displayed on the iPad using XDisplay.

I later worked on better lens mounting and purchased additional lenses so that I would have enough magnification to get a reasonable field of view.

Mounting “hardware” for a single lens using a hole cut in foam core.

Distortion caused by a single lens. Also notice the red/blue fringing (chromatic aberration) caused by the cheap lenses.

Software

I started by using the iPad’s internal gyroscope to rotate the world, so the view changes as the device is rotated around. This was easy because the gyroscope documentation is well written. The only potentially interesting part is the code that copies the values from the CMRotationMatrix into a GLKMatrix4.
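
A minimal sketch of that copy, assuming an Objective-C++ file with CoreMotion and GLKit available. CMRotationMatrix stores doubles while GLKMatrix4 stores column-major floats, so it’s essentially a cast plus a choice of ordering; whether you transpose depends on how you multiply your matrices, so treat this as a starting point.

```cpp
// Expand CoreMotion's 3x3 rotation matrix (doubles) into a 4x4
// GLKMatrix4 (floats). The element ordering here is an assumption;
// flip the transpose if the world rotates the wrong way.
#import <CoreMotion/CoreMotion.h>
#import <GLKit/GLKit.h>

static GLKMatrix4 GLKMatrix4FromCMRotationMatrix(CMRotationMatrix r)
{
    return GLKMatrix4Make((float)r.m11, (float)r.m21, (float)r.m31, 0.0f,
                          (float)r.m12, (float)r.m22, (float)r.m32, 0.0f,
                          (float)r.m13, (float)r.m23, (float)r.m33, 0.0f,
                          0.0f,         0.0f,         0.0f,         1.0f);
}
```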

Next, I worked on the software side of the inverse lens distortion, applied to the displayed image so the world isn’t distorted when viewed through the lenses. I experimented with different formulas in an online WebGL shader editor (the old Shadertoy), writing a shader that alters each texture sample’s distance from the distortion center using a simple quadratic (essentially, sampling at dist + constant*dist*dist from the center instead of just dist). This looked okay when testing the same lenses against my laptop screen, so I ported it to the iPad and tweaked the constant there, displaying a regular grid until its lines looked straight through the lenses. The iPad application started from the OpenGL ES game project template and hasn’t been cleaned up much since.
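
The shader itself boils down to a few lines. Here is the same math as a plain C++ helper (the names and structure are mine, not the actual shader), mirroring what the fragment shader does for each texture sample:

```cpp
// Quadratic radial distortion: sample at dist + k*dist*dist from the
// distortion center instead of dist. "k" is the hand-tuned constant.
#include <cmath>

struct Vec2 { float x, y; };

// uv and center are in texture coordinates (0..1).
Vec2 distortSample(Vec2 uv, Vec2 center, float k)
{
    const float dx   = uv.x - center.x;
    const float dy   = uv.y - center.y;
    const float dist = std::sqrt(dx * dx + dy * dy);
    // dist * (1 + k*dist) == dist + k*dist^2, the new sample distance.
    const float scale = 1.0f + k * dist;
    return { center.x + dx * scale, center.y + dy * scale };
}
```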

I found sample code online for rendering to a texture, and soon had a textured cube rendering to a separate texture for each eye. This version let the player look around at a skybox, but there was no other input for actually walking around. I created the skybox by extracting the original images from a Photosynth for iOS panorama, and I’ve put up the code I used to make a single image from a Photosynth panorama.
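
For reference, the per-eye setup is the standard OpenGL ES 2 framebuffer-object dance; this is a hedged sketch of what that sample code amounts to, not the code I found. A real version also wants a depth renderbuffer and a framebuffer-completeness check.

```cpp
// Create one offscreen color target per eye (call this twice).
#include <OpenGLES/ES2/gl.h>

GLuint makeEyeTarget(GLsizei width, GLsizei height, GLuint* outTexture)
{
    GLuint tex = 0, fbo = 0;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    *outTexture = tex;  // sample this when drawing the distorted quad
    return fbo;         // bind this before rendering the eye's view
}
```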

I used this panorama to calibrate the field of view: I took a photo from the panorama’s location facing some recognizable objects, then tweaked the field of view in the application until a photo taken through the lenses showed the same view of the world.

Panorama from a nearby park, used as the first skybox. (Photosynth created this image; I used a cube version extracted from the Photosynth tiles.)

Photo taken in park to calibrate field of view. This was taken on a different day from the panorama above.

Photo of the iPad environment taken through the lenses after field-of-view tweaking.

A Game Controller?

To create a game controller, I originally wanted to use an Arduino-based controller connected over the microphone line of the headphone jack. I found more than one Arduino soft modem implementation that synthesizes sounds on the Arduino and translates them into bits on the iOS side over the microphone line, but I stopped pursuing this because they quoted data rates of 50 bits per second. That would be sufficient for a single button, but nowhere near enough bandwidth for a two-axis analog stick and a handful of buttons: at 50 bps, even a three-byte report would arrive barely twice a second.

At GDC I met Jon Manning, who wrote a book on Objective-C and Cocoa, and he suggested using Bluetooth Low Energy. Searching online, I found the RedBearLab BLE Shield for only $35 from the Maker Shed. Their sample applications and code were easy to adapt to my needs. My Arduino sketch sends the control data 3 bytes at a time, and this works well.
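
Here is roughly what the controller side looks like. This is a sketch from memory, not my exact sketch: the ble_begin()/ble_write()/ble_do_events() calls come from RedBear’s sample library, and the pin assignments for the SparkFun joystick shield are assumptions to check against its documentation.

```cpp
// Send a 3-byte controller report (X axis, Y axis, button bitmask)
// over the RedBearLab BLE Shield.
#include <SPI.h>
#include <ble_shield.h>

const int PIN_X = A0;                     // joystick X (assumed wiring)
const int PIN_Y = A1;                     // joystick Y (assumed wiring)
const int BUTTON_PINS[4] = {3, 4, 5, 6};  // pushbuttons (assumed wiring)

void setup() {
  for (int i = 0; i < 4; ++i) pinMode(BUTTON_PINS[i], INPUT_PULLUP);
  ble_begin();
}

void loop() {
  // Squash each 10-bit ADC reading into one byte.
  byte x = analogRead(PIN_X) >> 2;
  byte y = analogRead(PIN_Y) >> 2;

  // Pack the buttons into a bitmask (active low with pull-ups).
  byte buttons = 0;
  for (int i = 0; i < 4; ++i)
    if (digitalRead(BUTTON_PINS[i]) == LOW) buttons |= (1 << i);

  ble_write(x);
  ble_write(y);
  ble_write(buttons);
  ble_do_events();  // let the BLE stack transmit the queued bytes
  delay(16);        // roughly 60 reports per second
}
```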

Game controller, iteration 1: Polymorph on the bottom, MintyBoost on the left, Arduino Uno, RedBearLab BLE Shield, SparkFun ProtoShield, and SparkFun joystick shield.

For the world geometry I wanted a mesh to work with, so I started with the Sponza from the Crytek models page. Unfortunately the full model was sluggish even on my laptop, and I wanted some baked lighting, so I built a lower-polygon version (about 5,000 triangles) and baked the original textures and some terrible lighting into a single 1024×1024 texture. I wrote a simple Python export script that just spits out the float array used in the sample code. This means the mesh is stored as individual triangles rather than the much-faster-to-render indexed triangle strip, but the model is simple enough that this isn’t a problem yet.
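
The exported data is just the interleaved vertex array the OpenGL ES template already draws with glDrawArrays. A made-up two-triangle example of the layout (the position/normal/UV ordering is my assumption, not the script’s actual output):

```cpp
// Hypothetical export: one GLfloat array, three vertices per triangle,
// no index buffer. Per vertex: posX, posY, posZ, nX, nY, nZ, u, v.
#include <OpenGLES/ES2/gl.h>

static const GLfloat gMeshVertexData[] = {
    -1.0f, 0.0f, -1.0f,  0.0f, 1.0f, 0.0f,  0.0f, 0.0f,
     1.0f, 0.0f, -1.0f,  0.0f, 1.0f, 0.0f,  1.0f, 0.0f,
     1.0f, 0.0f,  1.0f,  0.0f, 1.0f, 0.0f,  1.0f, 1.0f,

    -1.0f, 0.0f, -1.0f,  0.0f, 1.0f, 0.0f,  0.0f, 0.0f,
     1.0f, 0.0f,  1.0f,  0.0f, 1.0f, 0.0f,  1.0f, 1.0f,
    -1.0f, 0.0f,  1.0f,  0.0f, 1.0f, 0.0f,  0.0f, 1.0f,
};
// Drawn with glDrawArrays(GL_TRIANGLES, ...) using a stride of
// 8 * sizeof(GLfloat) in each glVertexAttribPointer call.
```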

Next Steps?

Honestly, I think the next step is to scrap all the code I’ve written so far and rebuild the iPad side in Unity3d. Unity3d would make it much easier to turn this into an actual game, letting me add new models and textures without fiddling with code. Unity will also handle a bunch of the boring things I don’t want to write, like building and using indexed triangle strips and compressed textures. That said, I’m not sure whether I’ll be able to access CoreBluetooth from Unity3d.
