Is augmented reality gaming a dead end, or can open, hackable technology bring gaming back outdoors?
Games have their evolutionary origins in learning and practice. For hunting and gathering, war and tribal conflict, it makes sense to have a way to practice techniques, strengthen teamwork and test outcomes without suffering the negative consequences when something goes wrong.
Sports in particular tap into this instinct, but even the most basic games do too. As video games become increasingly abstracted away from that root origin, they can lose a lot of their power and meaning, and leave the experience feeling a little shallow.
Computers are tools to be used, whether they are for work or for helping us have fun. So understandably, attempts to augment reality have focused on taking real world situations and using various tools to add to the experience, or taking a digital experience, such as gaming or navigating a UI, and bringing real world elements into the equation.
Both these approaches are fundamentally wrong, because they aim to take something pre-existing and modify it in an attempt to work with something for which it was not designed.
Take, for example, (American) football. Players have microphones and speakers in their helmets to communicate with their coach on the sidelines. It works because it doesn’t change the game, and is really just an upgrade of the existing mechanic where players go over to the coach to talk between plays.
Transfer that to soccer, and suddenly you have distracted players, ear pieces falling out when players get roughly tackled or head the ball, and in a fast-flowing game the whole thing rapidly breaks down. Moreover, soccer is designed to be a more chaotic game, one of confusion and quick reactions, where you don’t have much time to think. The technology makes sense on the surface, but really it goes against the underlying design principles of the game.
The key here is to design a game with the user interface as one of the core considerations. Virtual d-pads on touchscreen ports of games originally designed for other control schemes are a classic example of how you can’t just move games across interfaces without considering the impact on how the game is played.
So many augmented reality games fail to grasp that the interface to the reality in which the game is played is not the camera and screen of a smartphone; it’s the eyes, ears and other senses of the player. The phone is just one element of that UI, and should only be used where appropriate. It goes back to the idea of being so immersed in your augmented reality game that you walk into a lamp post. It sounds silly, but it is genuinely poor design, because it puts too much emphasis on forcing the player to view the world through a restricted and artificial frame.
A good example of a “reality” game is paintball. The game is played outside, with real paint and real air-powered guns that genuinely hurt when they hit you. Your phone doesn’t replace any of the existing real elements: not the gun, not the paintballs, not the player’s eyes. Instead, you add to the existing mechanics with an app that tells you how many paintballs you have left, where the flag capture point is on your map, and how many of your teammates are out of the game.
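To make the companion-app idea concrete, here is a minimal, purely hypothetical sketch of the state such an app might track. Every name here (PaintballMatch, Player, and so on) is invented for illustration; the point is simply that the app layers information on top of the real game rather than replacing any of it.

```python
from dataclasses import dataclass, field

@dataclass
class Player:
    """One participant; the real gun and paintballs stay in their hands."""
    name: str
    paintballs_left: int          # updated by the player, not enforced by the app
    eliminated: bool = False

@dataclass
class PaintballMatch:
    """State a companion app might overlay on the physical game."""
    flag_location: tuple          # (latitude, longitude) of the capture point
    players: list = field(default_factory=list)

    def teammates_out(self) -> int:
        # How many players on your team have been eliminated so far.
        return sum(1 for p in self.players if p.eliminated)

# Example usage: the app only reports; the game itself happens in the field.
match = PaintballMatch(flag_location=(51.5074, -0.1278))
match.players.append(Player("Alex", paintballs_left=40))
match.players.append(Player("Sam", paintballs_left=0, eliminated=True))
print(match.teammates_out())  # 1
```

The design choice worth noting is that nothing in this sketch mediates the player’s senses; it only surfaces bookkeeping that would otherwise be shouted across the field.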
The really exciting stuff comes, however, when you start to design games from the ground up, considering both the environment and the hardware. Laser tag is usually played in a confined indoor space because of the limitations of the laser guns, whilst a wide game like manhunt only works because it is played across a large field or wooded area with space to run, places to hide and so on. Those same balancing decisions need to be made with smartphones, Google Glass or whatever other hardware is being used. When is it being used? What are the physical logistics for players using it? What advantages or disadvantages does its use confer? Can its proposed use cause it to be damaged easily?
Open source hardware and software like Android, together with the rise of crowdfunding for hardware projects, from the OUYA to the Pebble watch, give designers more creative space to play with when considering the equipment games require, and the environments where that equipment can be used.