
Warning: Actual visible holograms are limited by HoloLens' initially small field of view.
Microsoft
LOS ANGELES — The upcoming HoloLens "augmented reality" headset — the one that creates images that appear to sit on real-world surfaces, hologram-style — got perhaps its most impressive public demo yet at Microsoft's E3 press conference on Monday. Footage of a player walking around a real-world table, viewing an apparent 3D representation of a live Minecraft session, made it look like a great application for Microsoft's augmented reality technology.
Later in the week, we got to try that demo for ourselves on prototype HoloLens hardware, along with two other experiences that showcased the hardware's gaming potential. Those demos highlighted HoloLens' ability to provide gaming experiences that can't quite be matched by other gaming hardware, but also the technical limitations that keep the current version of that hardware from being fully immersive.
Warzone, up close
Our first HoloLens demo at the show came as a surprise: a demo session of Halo 5's new "Warzone" multiplayer mode. These kinds of E3 demos are usually preceded by splashy introductions that describe what to look out for in a never-before-seen game. But Microsoft added some HoloLens flair to this gaming-conference trope, leading would-be players into a fake military compound that would soon come to life with floating holograms viewed through the HoloLens.
Sam Machkovech
With the headset on, the reality augmentation began. A set of solid dots appeared on a nearby wall, as if they had always been there. Soon we noticed a sharp virtual arrow pointing to the left on another wall at the end of a hallway, which stayed fixed in its apparent position as we walked about 20 feet to its location. It was a wonderful demonstration of how HoloLens works in a large space.
After making the requested turn, we first noticed the same issue Ars' Peter Bright complained about last month after a HoloLens demo: the incredibly limited field of view. The wall we were looking at was augmented with a virtual window through which we could see a battle taking place. However, we had to step back to make that whole window visible. If we stood too close to the marked spot on the wall, the virtual battle would be cut off at the edges of the small virtual window floating in our view.
The amount of visual space occupied by HoloLens' augmented imagery is roughly equivalent to a rectangle made by your thumbs and middle fingers, held about a foot in front of your face. Any augmented content that wasn't inside that little window in front of our faces simply didn't show up. That means moving your head a lot, or stepping back, to get a full view of objects and scenes.
The demo ended with us HoloLens testers standing around a giant table, which soon lit up with a Halo 5 Warzone-mode briefing featuring 3D models of the giant battlefield we were about to fight on, along with models of potential enemies and weapons we would use. We had played one Halo 5 game a few days earlier on this exact level, so we instantly recognized the terrain and buildings, which gave the scene a surprisingly magical callback feel.
We could walk around the table, lean in, look from above, and kneel down for a closer look at the miniature battlefield, and at no point did the content jitter in our vision in any significant way. Tracking was never lost. Unfortunately, that field-of-view crop severely limited the feeling of being immersed in a virtual/real-world hybrid; it was more like looking through a small magical window into another world than actually being in that world. And while it was an impressive briefing scene, this demo also didn't give us much of an idea of how HoloLens would work for actual gameplay.
Minecraft and guns
Later in the conference, we finally got a chance to try out some more directly interactive experiences on HoloLens. The first was a first-person shooting experience called Project X-Ray, which took place in a mostly bare conference room, about ten feet on a side. With a HoloLens unit on, flying insectoid robots seemed to burrow through holes in the walls and crawl toward us. Later, those same robots would hover in the air and fire slow-moving balls of energy and lasers at our heads, which we could dodge in the real world by ducking and weaving. To fight back, we aimed by simply looking at the robots and pulling the trigger on an Xbox controller in our hands.
While the demo experience wasn't very challenging, it was, all in all, quite a unique one. Being able to walk around the room without the constraints of a tethering wire or massive computer tower, rotating freely to see enemies from all sides and angles, is something that can't be done with any other virtual reality headset we've tried. The ability to see both the surrounding room and those enemies also meant we never had to worry about accidentally bumping into a wall or tripping over a stray chair, as we might in virtual reality.
Here, too, the 3D effects of augmented reality were particularly impressive. When a robot broke through a wall, the hole it left behind looked like it was embedded in the room's real fabric walls, staying in place and adjusting its 3D perspective as we moved around (though there was a tiny bit of annoying jitter if we moved our heads too fast). An in-game X-Ray ability gave us a virtual view through the walls, showing enemies before they popped out and, in a cute touch, revealing some fake pipes and wiring apparently hiding behind the wall.
Still, the limited field of view of the HoloLens reared its ugly head here. The small augmented-reality window meant we couldn't see the virtual robots unless we were staring directly at them, ready to fire. Our reticle had a series of red arrows showing where to turn our heads to find nearby enemies, but this seems like a stopgap for finding creatures that should have been easily visible in our peripheral vision, or even just slightly off center. The game design itself seemed limited by this concern; in an apparent concession to the visible field, robots would hover right in front of our faces before firing, rather than attacking from all sides.
After Project X-Ray, it was time for the Minecraft demo that had so impressed us at Microsoft's press conference. Before we started, we had to look around the room so that HoloLens could get a sense of the 3D space. After craning our necks around for about 30 seconds, a gaze-controlled 3D cursor was able to trace the contours of the room's walls, chairs, and filing cabinets.
Calibration complete, the demo started with a plain old 2D game of Minecraft projected on a virtual screen on the wall. On the plus side, I was able to make this screen incredibly large with a few voice commands, creating the equivalent of a flat TV of roughly 70 inches on what had been a bare wall. On the other hand, the HoloLens' limited field of view meant I could only see part of this TV at a time, even when standing about ten feet away. I had to shrink the virtual screen down to an apparent 30 or 40 inches before I could see everything at once without moving my head.
Another impressive feature of the HoloLens-powered virtual screen was the ability to activate a three-dimensional image, so that the scene seemed to recede into the wall like a shadow box. Unlike a standard 3D monitor, this 3D image actually changed perspective based on the viewing angle. If I walked up to the wall and looked at the screen from the left side, I could see parts of the world that would usually be beyond the right edge of the screen, as if the screen were just a window into another world.
But HoloLens isn't just about screens. I wanted to play with the world on a table like I saw in that press-conference demo. When I said "place world" out loud, I got to do just that, watching the formerly solid table fill with a grotto sinking toward the floor and a castle rising toward the ceiling. The 3D blocks stayed in the correct perspective and overall position as I moved around the table, even when I leaned my face in to be "inside" the blocks. There were minor issues with positional judder and frame rate when the world updated, but it was stable enough to feel, for the most part, like a somewhat ethereal part of the physical world.
This was my first chance to try out HoloLens gesture control, which tracks your hands in a way similar to Kinect, only from the inside out rather than the outside in. Those gestures were limited in the demo to what's called an "air tap," where your thumb and forefinger press together to register a point in space (the gesture is accompanied by a nice clicking sound from the little speakers next to your ears). From there I could pinch my fingers together and move my hand to pan and rotate the portion of the Minecraft world visible above the table. This process generally worked, but with a small, noticeable delay, similar to using gestures on the Kinect. If I moved my hand outside of a relatively small area right in front of my face, the tracking would stop.
After that, my guided experience closely mirrored the on-stage demo. I could see my companion's avatar as he walked around the world with an Xbox controller. I was able to guide him by placing signs using voice commands. I could summon lightning bolts with my voice to detonate TNT barrels that exposed the world's great cavernous underbelly.
Through it all, the limited field of view remained the biggest annoyance, especially when trying to lean forward for a closer look at the world. Still, as a proof of concept, Microsoft's HoloLens is just as impressive as the first demos we saw of Oculus' virtual reality technology three years ago, if not more so. Over the years, Oculus slowly solved most of the problems with that blurry, nausea-inducing demo. If Microsoft can make the same progress on HoloLens' issues before it becomes a consumer product, we'll be just as excited about its potential gaming applications.