SAN FRANCISCO—Valve Software’s famous “flat” structure means most of its game makers have vague titles. One of the few exceptions is its Principal Experimental Psychologist, who presented a futuristic game vision at this year’s Game Developers Conference. In particular, he made some eyebrow-raising admissions about how Valve might one day study your brain activity in the middle of a game session, and what the company could do with it.
Before his talk began, Valve Software’s Mike Ambinder offered a very loud disclaimer, fitting for GDC’s “vision” track of panels: “This would be speculative,” he said. “This is one possible direction that things could go.” Even with that caveat in mind, Ambinder’s choice of details is interesting to sink our teeth into, especially coming from a company that lately seems to offer more speculation about the future of gaming than actual applications (i.e., new games).
The slot machine of your mind?
The images above and below of Ambinder clowning around with Valve co-founder Gabe Newell weren’t just for yuks: “Every talk I’ve given, that reliably gets a laugh. Think about that. What if we could reliably elicit responses [with video games] and determine that we did?”
Ambinder compared a gamer’s use of a controller to an everyday conversation. In his analogy, when you talk to someone, your words and gestures are the equivalent of pressing controller buttons: loud, clear signals. But when it comes to subtler signals like facial expressions, “video games miss out on the nonverbal part of a conversation,” he said.
As a result, games face interface limitations, such as memory, Ambinder argued. A player can remember perhaps 20-100 combinations of mouse-and-keyboard commands, he said. That may seem like enough for some games, “but that’s still a limitation.” And the time it takes a signal to travel from brain synapse to finger averages roughly 100 milliseconds, he estimated. “What if we could shave that time down to 20-30 milliseconds?” he asked.
Existing hardware, if money were no object, could let game makers track everything from synaptic responses to galvanic skin response, from eye gaze to muscle tension and posture. Many of Ambinder’s suggestions for what a game maker could do with this data sound like existing experiments with heart-rate sensors, such as adjusting a game’s difficulty based on how a player is feeling at a given moment. We’ve seen Valve (and other studios) talk up the gameplay-tweaking potential of eye-tracking in previous years.
But one suggestion in particular raised an alarm bell: customizing a game’s virtual rewards on the fly. “We can figure out which kinds of rewards you like and which you don’t,” Ambinder suggested, possibly based on the physiological reactions a player has when handed loot. However, he didn’t address the very serious privacy implications of this feedback loop, nor the abuse potential of a game that pumps players with loot-driven endorphins just when they might be getting bored. (The mechanics of slot machines and loot boxes are already frowned upon for artificially toying with players’ expectations in hopes of hooking them for longer.)
“Make You Love the Predator”
Other ideas, on the other hand, sounded downright positive, if no less wild. For example, this level of connectivity could help blind, deaf, and otherwise sensory-impaired players perceive a video game in new ways. That would require transcranial magnetic stimulation (TMS), a particularly far-out kind of treatment already being used to help sufferers of insomnia, PTSD, and other conditions.
“Can we show you in infrared like the Predator? Give you access to spatial location data, or even echolocation? Can we add a sense? Improve your sense of touch? Help you notice new frequencies?” Ambinder continued. “Taste and smell things you’ve never tasted or smelled before? Focus your attention? Stimulate certain parts of your brain to recruit neurons for other tasks? Help you retain more memory at once? Improve memory retrieval?”
Remember that “speculative” caveat from earlier? There you go. That particular hypothetical scenario would require TMS systems, and those are admittedly a much more distant possibility than the head-mounted, synapse-monitoring systems Newell is seen wearing in the photos above.
Of course, the near-future capabilities of these tracking systems are also limited by the amount of head and skin contact they require. Such connectivity would mean buying a bulky headset, and finding players willing to wear it. But Valve has at least one way to get players to adopt such a crazy scenario. When asked about its feasibility, Ambinder gestured to a photo of himself wearing an HTC Vive VR headset. “Not everyone plays VR, but VR gives you semi-consistent contact with a source of brain activity,” he said. That’s as close as Ambinder came to confirming rumors of a Valve-produced VR headset on the company’s behalf.
Until this kind of system becomes more commonplace, Valve clearly has some practical work to do: either make the hardware itself or wait for more game makers to buy into the ultra-connected theory that games could provide an “enhanced, higher-quality player experience if developers had access to states, emotions, cognition, and decisions.” In the meantime, now, not later, may be the time to start talking about the privacy implications inherent in “games as a service” reading and responding to our brainwaves.