Niantic says this was the very first piece of concept art for Pokémon Go. (Credit: Sam Machkovech)
The first prototype of the game’s map interface, with GBA-style icons and more “grassy” zones in cities.
Day and night cycles were worked on but eventually scaled back for the launched game.
Day/night lighting was originally more pronounced in non-AR battle scenes.
This “hologram” version of a player avatar was eventually dropped.
This thumb heat map was used internally to determine how menus would work for one-handed smartphone players.
SAN FRANCISCO – The Game Developers Conference has always offered special panels and stages for the biggest mobile games of the moment, so it was no surprise that Pokémon Go, a game that didn’t even exist last March, made its appearance. The game’s only panel at this year’s event didn’t offer an all-encompassing, behind-the-scenes look at every facet of development, but it still proved to be a solid glimpse of the early stages of the phenomenon.
Niantic Visual Design Director Dennis Hwang was on hand to recall the successes of the developer’s earlier augmented reality game, Ingress. Those included thousands of user meetups hosted by players themselves, along with users taking GPS-tracked walks to Ingress points of interest (and saying they’ve lost weight from all that walking). “Our goals were modest at first,” Hwang said. “If [our players] had just walked a few extra blocks, we would have been totally happy, because we knew how hard it was to motivate someone to get off a chair and move because of an app or game.”
When it came time to apply Ingress’s real-world systems to a Pokémon game, Niantic had a few goals: make every player a hero, broaden the game’s appeal to more ages and types of players, and overcome the problem of cognitive dissonance. Meaning: build a gaming experience that makes players feel like a Pokémon trainer without the jarring sensation of looking at their screen, then back at the real world, and feeling that the two are strangely far apart.
This first manifested itself in Pokémon Go’s development because Niantic feared that average smartphones would struggle to display complicated 3D scenes. How could the game make it seem like players were capturing a monster in a real location, even on low-end phones? The first idea was to use 360-degree “photospheres” that would appear based on where the player was. However, Hwang said it is difficult to accurately and consistently place characters in these photospheres; they often appear too large or too small compared to the scene, especially since photospheres don’t contain scaling metadata and were often generated from user-submitted images. (Niantic says Ingress fed the company roughly 10 million locations to use in Pokémon Go.)
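The scaling problem is easy to see in a toy projection calculation (an illustrative sketch only, not anything from Niantic’s codebase): a photosphere records directions but not distances or real-world scale, so the apparent size of a monster placed in it depends entirely on a guess.

import math

def apparent_height_px(character_height_m, assumed_distance_m,
                       vertical_fov_deg=75.0, image_height_px=1920):
    # Pinhole-style projection: on-screen size shrinks with assumed distance.
    angular_size = 2 * math.atan(character_height_m / (2 * assumed_distance_m))
    return image_height_px * angular_size / math.radians(vertical_fov_deg)

# A 0.4 m Pikachu rendered into the same photosphere looks wildly different
# depending on the distance we assume, because the image itself carries no
# scale metadata to pin that guess down.
for guess in (1.0, 3.0, 10.0):
    print(f"assumed {guess} m away -> {apparent_height_px(0.4, guess):.0f} px tall")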
Hwang didn’t describe exactly how the developer built its augmented-reality, camera-powered “real-world monster” system, but he did show the very first prototype of it in action: capturing a Pikachu in a developer’s backyard. Even at that early stage, players were expected to use a single finger to swipe a Pokéball at an occasionally evasive monster in the real world. Back then, however, sprites from the series’ Game Boy Advance games were used in lists and menus instead of icons of the game’s 3D-rendered characters.
The art team then turned its attention to the game’s map, which translates real-world map data into a Pokémon-like overworld. Originally, the design process leaned toward greener, more lush designs by default, in order to more closely resemble the artistic style of the series’ biggest handheld hits. However, cognitive dissonance quickly emerged in real-world playtesting. Hwang pointed out that the nearby intersection of Fourth and Folsom in San Francisco, which is mostly pavement and buildings, would look weird “painted with lush vegetation.”
Lighting Engine: Taking out two Pidgeys with one Onix
Lighting advice for Pokémon Go.
Internal documents about how different characters would look compared to each other.
A detailed look at shaders and other numerical development data.
For similar reasons, earlier “avatar” characters that resembled characters from the Game Boy Advance games were scrapped in favor of a humanoid style unique to Niantic’s version of the Pokéverse. Hwang didn’t connect every dot between the human and monster designs, but the biggest thing they have in common is a lighting engine that favors “amorphous, even lighting.” This had the dual benefit of making characters readable against any background generated by smartphone cameras while also requiring fewer shader passes, reducing GPU overhead on low-powered smartphones.
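As a rough sketch of that trade-off (illustrative only; Niantic’s actual shaders were not shown in code form), flat ambient shading costs roughly one multiply per channel and looks the same over any camera feed, whereas conventional per-light diffuse shading needs an accumulation per light and depends on light directions a phone camera can’t supply.

def flat_shade(albedo, ambient=0.9):
    # "Amorphous, even lighting": one cheap multiply per channel, and the
    # character reads the same against any camera background.
    return tuple(min(1.0, c * ambient) for c in albedo)

def diffuse_shade(albedo, normal, lights):
    # Conventional per-light diffuse shading: one accumulation per light
    # (more shader work), and the result shifts with light directions that
    # a phone's camera feed can't actually report.
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    intensity = sum(max(0.0, dot(normal, direction)) * power
                    for direction, power in lights)
    return tuple(min(1.0, c * intensity) for c in albedo)

pikachu_yellow = (1.0, 0.85, 0.1)
print(flat_shade(pikachu_yellow))
print(diffuse_shade(pikachu_yellow, normal=(0.0, 1.0, 0.0),
                    lights=[((0.0, 0.7, 0.7), 0.8), ((0.6, 0.8, 0.0), 0.5)]))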
Before wrapping up, Hwang hinted at features and updates coming to the app (although the panel said very little about the game’s mechanics). He offered two key hints: features that would align with mass gatherings of people in the real world (as in, more than 1,000 Pokémon Go players in the same physical place) and efforts to “continue to build a real connection with the player” in the form of day/night and weather effects in the game experience.
“There were so many ideas that we wanted to build that we didn’t have time for,” Hwang said. “The version of Go that eventually went to the public contained 10 percent of the ideas Niantic development members had.”
Listing image by Sam Machkovech