Augmented Reality (AR) has been accessible to smartphone users for quite some time, but it was only with the release of the mobile game Pokémon Go in 2016 that this technology entered the consciousness and everyday life of millions of people. Even though the number of regular players is already in decline, smartphone applications such as this one provide an interesting insight into a recent kind of mobile multimodal and multispatial behaviour. AR apps foster the simultaneous interaction with, and the merging of, virtual and physical spaces by encouraging players to move through and interact with their actual environment. As the virtually augmented environment is at the same time transformed into another world on screen, players “see the real world, with virtual objects superimposed upon or composited with the real world” (Azuma 1997). This project takes up mobile augmented reality as a research focus and provides, in a first step, a methodological reflection on how to capture multimodal interaction among players collaboratively engaged in the gameplay of Pokémon Go, as well as between single or team players and their environment at the boundaries of virtual and physical space. We want to answer the question of how video and audio data can be collected using various technological tools in order to make multimodal interaction, both spaces, and the transitions between them observable.
Project leadership: Nathalie Meyer, Christina Brandenberger
Funding source: URPP Language and Space