This is the HMD I showed at the VanVR April event. It is a fully operational display that incorporates the following design improvements:
– Clear plastic construction: This seemingly trivial improvement has some strong benefits when wearing a headset in a complex environment, say a home setting with pets, children and other moving obstacles. By retaining your peripheral vision you can anchor yourself in the real world while enjoying a virtual world in your central field of view. I received many positive comments on this approach from “long time” Oculus wearers; of the twenty or so people who tried it out, only one tester commented that they preferred the blackout experience.
– Adjustable interpupillary distance: By mounting the lenses in 3D-printed nylon (Taulman Bridge) clasps and constructing a lateral box frame with a slot for a pull tab, I was able to make the interpupillary distance between the lenses physically adjustable: simply pull or push the tabs on either side of the helmet. Although the physical system worked reasonably well, it gave users (including myself) no feedback to indicate when the adjustment was correct. I was, however, able to give one gentleman with a particularly narrow IPD a convincing VR experience without optical distortion for the first time, as physical IPD adjustment is not currently a feature of the standard Oculus Rift DK1. This single event made the effort of adding the adjustment completely worthwhile for me.
– No face touching: The design incorporates a front-to-back head clamp that uses an elastic strap and static friction to hold the head mount firmly in place, while the whole weight of the HMD is supported by the top strap (not shown in the picture). This provided sufficient rigidity to completely remove the need for a nose bridge or cheek pads, which are typically used to prevent axial sway as the head is turned rapidly.
The headset design however does have several drawbacks:
– The head clamp system was only comfortable for a subset of head shapes and sizes, and was incapable of working on a child's head.
– The headset design did not allow the user to wear a pair of over-the-ear headphones.
Overall a very successful iteration, and the feedback from the people at VanVR has provided lots of good fuel for future improvements.
In this iteration I explore the concept of a rigid beam going all the way around the head. This allows me to move the weight of the HMD to the top of the head and removes all touch points from the face. Lots of details in the video.
Just wanted to introduce you to my version of the MythBusters’ famous “Buster”. No explosions or dead-falls, I am afraid, but he is ready, willing and able to act as the head model for some rather ungainly and ugly headgear.
One of the key physical requirements of a VR headset of any type is the ability to adapt to the viewer’s individual facial and optical characteristics. A key component of this is interpupillary distance. The distance between pupils varies considerably from individual to individual, with the extremes being 52mm and 72mm. As you can see in the image, if you are unlucky enough to be at the extremes of the range shown, you are not going to get a satisfactory experience from a fixed-width head mounted display. Oculus attempts to address this by moving the virtual eyeballs to the correct setting when it renders the two images, but there is no physical adjustment on the Oculus Devkit 1 and, to the best of my knowledge, the Devkit 2. From what I can see of Sony’s Morpheus, I don’t think you can adjust the lateral distance between the lenses either. I would like to incorporate this feature into my headset designs at some point and, minimally, set the interpupillary distance of my headsets to the average by default.
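The software-side correction described above can be sketched in a few lines: given a measured IPD, each eye’s virtual camera is offset laterally by half that distance from the head’s centreline. This is a minimal illustration only, not the Oculus SDK API; all names and the 63 mm default are my own assumptions.

```python
# Minimal sketch of software IPD adjustment: each virtual eye camera is
# offset by half the interpupillary distance from the head's centreline.
# Names and the default value are illustrative, not the Oculus SDK API.

AVERAGE_IPD_M = 0.063  # roughly the commonly cited adult average (~63 mm)

def eye_offsets(ipd_m: float = AVERAGE_IPD_M) -> tuple[float, float]:
    """Return the lateral (x) offsets of the left and right eye cameras,
    in metres, relative to the centre of the head."""
    half = ipd_m / 2.0
    return (-half, +half)

# The extremes mentioned above: a narrow 52 mm and a wide 72 mm IPD
narrow_left, narrow_right = eye_offsets(0.052)
wide_left, wide_right = eye_offsets(0.072)
```

Each offset would then be applied as a translation to the eye’s view matrix before rendering, which is (in outline) what a physical lens adjustment achieves optically.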
In order to adjust a head mounted display to the correct interpupillary distance, you need to know what your personal distance is. To that end I developed a series of physical measuring devices, culminating in the following prototype. The idea is that you hold the device in front of your eyes and slide the two holes closer together or further apart until each pupil is centered in its hole, as seen in the mirror’s reflection. Once set, the device can be placed on a table and the distance between the two eye holes measured.
It worked well when I stood in front of a mirror in good light, and it was portable, but when I tried it on some volunteers it did not work so well. The usage model was unintuitive to them, and without a stable mirror it was virtually impossible to align correctly. For my next prototype I will build a new version that uses long parallel tubes with a light source at the end, and perhaps incorporate a measurement scale or Vernier system.
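For the curious, the reason the mirror approach can work at all is a small piece of reflection geometry: each eye’s own mirror image lies directly along that eye’s line of sight, so the correct hole position is independent of how far away the mirror is. A minimal sketch of that geometry, with all coordinates and distances purely illustrative:

```python
# Geometric check of the mirror method: an eye's reflected image lies
# directly along that eye's own line of sight, so the hole that centres
# on the reflected pupil sits exactly at the pupil's lateral position,
# regardless of mirror distance. All values are illustrative.

def hole_position(eye_x: float, eye_to_mirror: float, eye_to_device: float) -> float:
    """Lateral position where the sight line from an eye (at x = eye_x,
    z = 0) to its own mirror image crosses the device plane, with the
    mirror at z = eye_to_mirror and the device at z = eye_to_device."""
    image_x = eye_x                        # reflection preserves lateral position
    image_z = 2.0 * eye_to_mirror          # image sits as far behind as eye is in front
    t = eye_to_device / image_z            # fraction of the way to the image
    return eye_x + t * (image_x - eye_x)   # = eye_x for any distances

# Half of a 62 mm IPD, mirror anywhere from 0.3 m to 2 m away:
positions = [hole_position(0.031, d, 0.02) for d in (0.3, 0.5, 2.0)]
```

This also hints at why the volunteers struggled: the self-alignment only holds when each eye sights its *own* reflection through its own hole, which is exactly the part of the usage model they found unintuitive.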
Well, who knew that 3D printers were weather-sensitive? We had a cold snap (-1 to -8) and all the cool printing I wanted to do for my Oculus Rift lifted right off the build platform… ugh. However, we have …
From there I went on a very wide and circuitous path, ever deeper into complex software systems, big data and the like, which ultimately led me into the intelligence space, where I have been lurking until recently. Now, with the rise of consumer 3D printers, 3D television and the Oculus Rift, I have the opportunity to return at last to my passion, which leads me to this blog.
My goal is to share my thoughts and dreams about how the lines are being ever more blurred between the virtual and physical worlds as well as my personal adventures in this arena. Welcome and enjoy 🙂
Welcome to my musings on all things virtual, physical and all gradations in between. I started this journey long ago at the BBC, where I pioneered real-time virtual set technology for BBC News (as well as creating all the 3D graphics and “over the shoulder” branded imagery for both BBC News and BBC World Service News, but that’s another story).
This is an image from the first ever BBC News broadcast to use the “glassy” globe and coat of arms I created in 1993.
If you would like to see the full virtual set in all its glory, watch http://www.youtube.com/watch?v=h8kGSBN7cu from the 30-second mark on. Everything is virtual except the announcer, who is being blue-screened in using an ADO (an old-world way of placing real-time video onto a 2D texture in a 3D environment).
I also had a great and fun experience working with Peter Snow and Mike Afford, writing all the software for his then-famous 3D graphics presentations for the 1992 US election.
Sadly I seem to have been written out of history on this, but here is a complete article on my work.