While Calvium has been banging the drum for digital placemaking for the last few years, we’ve been studying the relationship between people, place and technology for much, much longer. Looking back at our earliest publications, it’s reassuring to see that this connection has always been at the heart of our work – even before we were called Calvium.
From the very start of their work at HP Labs, our directors were working to create magic moments at historical sites, blend the physical and the digital, and bring environments to life. In 2006, for instance, the team from HP Labs rolled out Roku’s Reward – a concept for the kind of smart device-driven AR gaming that would reach worldwide fame with Pokémon Go in 2016.
We’ve been getting nostalgic of late, dusting off the archives to explore some of our older projects and revisit the critical thinking and ideas that have formed the foundations of our digital placemaking practice today. Join us on a trip down memory lane…
Riot! 1831 – the design of a location-based audio drama (2004)
Riot! 1831 was a joint commission from Mobile Bristol, HP and Bristol University, telling the story of the reform riots that took place in Queen Square.
As the first GPS-controlled audio drama ever developed, it served as an exploration – a chance to try out a new form of heritage experience. During the three weeks of activity, over 700 people took the opportunity to walk through the Queen Square, Bristol, of 1831. Carrying iPAQ handhelds and GPS receivers in backpacks, they could hear the voices, the flames and the cavalry charge recreated around them. Nowadays, of course, we have this kind of tech in our pockets and take it for granted, but at the time it was cutting edge. Feedback from the experience told us that listening to audio in situ added a level of meaning – and that was the jumping-off point for our later research.
Parallel Worlds – immersion in location-based experiences (2005)
We followed up Riot! 1831 by exploring exactly what we’d done to keep people immersed in the experience for over an hour at a time. Specifically, we found that immersion was guided and fuelled by changes in the audio environment, and sometimes broken when GPS errors or particular pathways made the audio playback choppy.
Specific moments – fleeting, sporadic periods of deep immersion – made for a compelling experience that compensated for these technical difficulties. Powerful new sounds, familiar names and places, accents people recognised and perceptible physical objects all created those moments we were looking for…
Skip forward to 2018 and these types of ‘immersive experiences’ dominate our culture. We’re living in an experience economy, where people would rather spend their money on activities that will provide them with stories and memories than on physical ‘stuff’. The quest for immersion has even spread beyond arts and culture, with sectors such as retail adopting experiential approaches to try to drive more people onto the high street. We didn’t invent immersion, but with Riot! 1831 we were on the cutting edge of storytelling, of experiential ‘happenings’, and of the shift in what people were coming to expect from their leisure activities.
Magic Moments in Situated Mediascapes (2005)
The moments – the hooks into a simulated past that generated true, deep and lasting immersion – became the bedrock of our new framework for experience design. We found six triggers for these magic moments:
- moving through a sea of voices
- ‘collisions’ where the physical and augmented realities aligned
- synaesthetic confusion when those realities didn’t align
- new contexts for familiar environments
- specific routes people could follow through the space
- social bonding – participants could share moments if it was obvious that other people were also participating
Creating these ‘magic moments’ has been the practical goal for all our heritage designs ever since, and we can find all the triggers in successful AR apps across other domains. Night Sky turns collisions and context into a learning experience – you point your phone camera at the sky and the app turns your view into a personal planetarium. GIPHY World hinges on social bonding, creating 3D spaces and scenes for other users to explore, while AR Runner provides specific routes and competitive goals for exercise in the space around its users.
Design for emergence – experiments with a mixed reality urban playground game (2007)
CitiTag was the testbed for our ideas about creative, social, interactive app-based play. We designed a simple experience, putting the bulk of our focus into the surrounding environment and people rather than the technology itself. Play would emerge from a few straightforward high-level rules, rather than be dictated by a complex tech-led design.
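To give a flavour of what ‘a few high-level rules’ can mean in practice, here is a minimal sketch in Python. It is not the actual CitiTag code – the team names, the ten-metre range and the tag-and-rescue mechanic are illustrative assumptions – but it shows how an entire playground game can hang off two proximity rules:

```python
from dataclasses import dataclass
from math import hypot

# Illustrative sketch only, not the real CitiTag rules. Positions would come
# from each phone's GPS/WiFi fix; here they are simple planar metres.

TAG_RANGE_M = 10.0  # assumed: how close two players must be for a rule to fire

@dataclass
class Player:
    name: str
    team: str          # e.g. "red" or "green"
    x: float
    y: float
    tagged: bool = False

def distance(a: Player, b: Player) -> float:
    return hypot(a.x - b.x, a.y - b.y)

def apply_rules(players: list[Player]) -> None:
    """Two high-level rules; everything else emerges from how people move.

    Rule 1: an un-tagged player tags any nearby opponent.
    Rule 2: an un-tagged player frees any nearby tagged teammate.
    """
    for a in players:
        if a.tagged:
            continue
        for b in players:
            if a is b or distance(a, b) > TAG_RANGE_M:
                continue
            if b.team != a.team and not b.tagged:
                b.tagged = True          # Rule 1: tag an opponent
            elif b.team == a.team and b.tagged:
                b.tagged = False         # Rule 2: rescue a teammate

# One tick of the game with three players in the square.
players = [
    Player("Ana", "red", 0.0, 0.0),
    Player("Ben", "green", 4.0, 3.0),    # 5 m from Ana, so gets tagged
    Player("Caz", "green", 60.0, 80.0),  # too far away, nothing happens
]
apply_rules(players)
print([(p.name, p.tagged) for p in players])
```

Everything interesting – the chases, ambushes and rescues – emerges from how people actually move through the city, not from anything written into the rules themselves.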
This idea of emergence from play is at the heart of 2012’s Block by Block – the partnership between Minecraft developer Mojang and UN-Habitat, the UN’s programme for sustainable cities. Block by Block turns Minecraft into an affordable, accessible planning tool, allowing architects and environment builders to welcome community participants into the urban design process. Gamification techniques have since become far more widely adopted too, and can be found in everything from social media to advertising.
Walking the GPS Line – insights into the use of shape walking as a game mechanic (2008)
The iPhone was launched in 2007, and as smartphones became more common in the consumer market, we were able to move our focus away from the hardware that made these interactive experiences possible and onto apps that experimented more with game mechanics and user experience. With GPS and WiFi tech concentrated in one device, it became easier to use positioning as a gameable mechanic rather than a passive trigger – people could walk around a physical space with the tools in their pocket to augment it with digital experiences. For example, we looked into iMagick – a mixed-reality role-playing game where players activate spells by tracing a shape in the real world, walking the entire length of a route projected through their devices. That was just the beginning.
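To make the mechanic concrete, here’s a rough Python sketch of how a walked GPS trace might be checked against a target shape. It illustrates the general idea rather than iMagick’s actual matching code – the resample-normalise-compare pipeline, the 32-point resolution and the 0.2 tolerance are all assumptions – and it compares positions directly, so it assumes the player walks the shape in roughly the same orientation and direction as the template:

```python
import math

# Coordinates are assumed to be metres in a local planar projection of GPS fixes.
Point = tuple[float, float]

def resample(path: list[Point], n: int = 32) -> list[Point]:
    """Pick n evenly spaced points along a polyline (a GPS trace or a template)."""
    seg_lengths = [math.dist(a, b) for a, b in zip(path, path[1:])]
    total = sum(seg_lengths)
    out, acc, i = [path[0]], 0.0, 0
    for k in range(1, n):
        target = total * k / (n - 1)
        while i < len(seg_lengths) and acc + seg_lengths[i] < target:
            acc += seg_lengths[i]
            i += 1
        if i >= len(seg_lengths):
            out.append(path[-1])
            continue
        t = (target - acc) / seg_lengths[i]
        (x0, y0), (x1, y1) = path[i], path[i + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def normalise(path: list[Point]) -> list[Point]:
    """Remove position and size so only the shape itself is compared."""
    cx = sum(x for x, _ in path) / len(path)
    cy = sum(y for _, y in path) / len(path)
    scale = max(math.dist(p, (cx, cy)) for p in path) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in path]

def shape_matches(trace: list[Point], template: list[Point], tol: float = 0.2) -> bool:
    """True when the walked trace is, on average, close to the target shape."""
    a = normalise(resample(trace))
    b = normalise(resample(template))
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a) < tol

# An L-shaped 'spell' template and a slightly wobbly walk that traces it.
template = [(0.0, 0.0), (0.0, 10.0), (6.0, 10.0)]
walked = [(0.3, 0.1), (0.1, 5.0), (-0.1, 10.1), (3.0, 10.2), (6.1, 10.0)]
print(shape_matches(walked, template))  # True: the wobbly walk still matches
```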
Ten years later, GPS has become one of the most powerful tools in the app developer’s toolkit, with applications that extend far beyond games. A revolutionary navigation app called what3words takes the GPS-powered ‘satnav’ concept beyond Google Maps and road atlases, dividing the entire world into 57 trillion 3m x 3m squares, each with a unique three-word address. What’s so great about that? It can find locations anywhere – including streetless addresses in developing nations, where residents are otherwise off the grid and all but impossible for emergency services or state support networks to find. Those early experiments have yielded powerful results.
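The underlying idea is simple enough to sketch. The Python toy below is emphatically not what3words’ actual algorithm or wordlist – the tiny vocabulary, the naive lat/lon grid and the encoding are all illustrative assumptions – but it shows how a fixed grid plus a word encoding turns any point on Earth into a short, speakable address:

```python
# Toy grid-based addressing in the spirit of what3words. A genuine system needs
# a vocabulary of tens of thousands of words for every three-word address to be
# unique; this eight-word list repeats, so it only illustrates the concept.

WORDS = ["apple", "bridge", "cloud", "dune", "ember", "fable", "grove", "harbour"]
CELL_DEG = 0.00003  # roughly 3 metres of latitude per cell edge

def cell_index(lat: float, lon: float) -> int:
    """Number the cells of a naive equirectangular lat/lon grid, row by row."""
    rows = int(180 / CELL_DEG)
    cols = int(360 / CELL_DEG)
    row = min(int((lat + 90) / CELL_DEG), rows - 1)
    col = min(int((lon + 180) / CELL_DEG), cols - 1)
    return row * cols + col

def three_word_address(lat: float, lon: float) -> str:
    """Encode the cell number in base len(WORDS) as a dot-separated word triple."""
    n, base = cell_index(lat, lon), len(WORDS)
    return ".".join(WORDS[(n // base**i) % base] for i in range(3))

print(three_word_address(51.4545, -2.5879))  # a word triple for a spot in Bristol
```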
Design for coincidence – incorporating real world artifacts in location-based games (2008)
Our next project tied digital mediascapes more closely to specific locations, artifacts and events in the physical world. We framed three different kinds of coincidence – natural, social and feigned – which could be used to create our ‘magic moments’, and looked at how mobile designers could predict or create these coincidences.
The games we studied either let players map in-game locations onto physical ones – making the most of features in their immediate environment and creating coincidence socially – or referred to common elements that designers could expect players to find. By knowing the environment intimately, designers could predict natural coincidences and build the game around what was probable.
In the heritage space, we were able to draw on historical events to inform our choice of places and things to build into our games – notably in an early build of our game app for the Tower of London, where you could help worthy prisoners escape from the tower during your visit.
Priming, sense-making and help – analysis of player behaviour in an immersive theatrical experience (2010)
As the baseline technology of powerful smartphones, reliable GPS and widespread connectivity became ubiquitous, we were able to focus on user experience: how people interacted with pervasive computing. Evaluating Punchdrunk’s ‘Last Will’ interactive theatre experience, on which our HP Labs team partnered, we found three key perspectives for understanding how users engaged with the game.
Priming was important – how users discover what they can do and how to do it, and what designers can build in to avoid frustration and provide guidance. Making sense of the experience mattered – we needed to understand how people assembled the visual, auditory and haptic stimuli we provided. Finally, we needed to provide contextual help – a way of guiding players through the experience to maintain flow without breaking the immersion.
The more we learned about UX, the more seamless our designed experiences became, integrating digital tech more naturally and smoothly with physical environments. Mobile phones and the theatre are not a natural pairing, but apps can extend the theatrical experience, delivering archive content, festival programming, learn-as-you-go script glossaries and more.
As digital placemakers, we’ve come to understand that easy-to-use tech is absolutely vital. Bringing the people who use and know a space into our projects is a cornerstone of our practice: for that to work, the tech needs to be self-explanatory and straightforward, welcoming them into the process rather than putting them off.
We’ve been working with people, place and tech for a long time. Over the years we’ve built a knowledge base and refined our thinking across media, UX and technology, developing the frameworks that underpin the digital placemaking expertise we have today.
