So 2014 is here. A time to reflect on some of the research projects we have been doing in the margins of 2013.
In June Mudlark received funding from the TSB Feasibility Study fund to look at tracking user travel behaviour with a view to building game logic around it. This was a piece of parallel research designed to complement our previous work on Chromaroma by extending the game beyond the constraints of fixed infrastructure in the form of smart cards and RFID terminals in stations and on buses.
We wanted to explore how a user or player could get an enhanced history of their travel in London and beyond by monitoring their movement and transport habits over time.
We were aware of the success of the Moves app and the effortless way it distinguishes different modes of travel – walking, cycling, running and generic transport – with a near-miraculous level of accuracy. We wanted to go a step further and see if we could achieve a more granular level of detail than Moves provides in its catch-all "transport" type. We believed that by developing our own system we could tease out the fine distinctions between train, underground, bus and car.
Working with the School of Electronic Engineering and Computer Science at Queen Mary University, we made this happen by combining sensors and data available on current smartphones – GPS and accelerometer. We developed an Android app built around an algorithm that classifies movement patterns and corrects classification errors by analysing a user's movement over small segments of time.
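The post doesn't publish the algorithm itself, but the error-correction step can be sketched as majority-vote smoothing over short time segments – a common way to clean up per-segment classifications. The function, window size and mode labels below are illustrative, not taken from the actual app:

```python
from collections import Counter

def smooth_modes(raw_modes, window=5):
    """Correct isolated misclassifications by replacing each per-segment
    label with the majority label in a sliding window around it."""
    smoothed = []
    for i in range(len(raw_modes)):
        lo = max(0, i - window // 2)
        hi = min(len(raw_modes), i + window // 2 + 1)
        majority, _ = Counter(raw_modes[lo:hi]).most_common(1)[0]
        smoothed.append(majority)
    return smoothed

# A stray "car" reading inside a run of "bus" segments gets corrected:
print(smooth_modes(["bus", "bus", "car", "bus", "bus"]))
```

The idea is simply that a single anomalous segment is more likely to be sensor noise than a genuine one-segment car journey in the middle of a bus ride.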
Our feasibility test around East London in December, with five test subjects travelling by bus, train, underground and cycle, correctly identified the user's transport mode shifts with an overall accuracy of 78.5%. This was a remarkable outcome for something working only with the data generated by the phone from the testers' movements.
We also wanted to augment the sensor algorithm with online look-ups of live transport data. To test this aspect we did some quick development using both the Moves API and the Transport API (a service which provides user routing on the UK's transport networks). We took my Moves data, generated on my iPhone, and ran queries on the start and end points of journeys with Transport API's routing service. This produced remarkably accurate predictions of journey type, down to the bus route or train line.
We ran across some issues with it falsely choosing bus instead of train and vice versa. We discovered accuracy improved when we chose the fastest journey from the list of possible routes between A and B. This heuristic obviously won't always hold: a user may choose a slower journey, so a future addition would be a method of tracking likely routes based on the user's past travel history and how often they have taken each route before.
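That future addition could be sketched as a scoring function that blends the fastest-journey heuristic with past usage. The route IDs, weighting and history format here are entirely hypothetical – the study only describes the idea:

```python
def choose_route(candidates, history):
    """Pick the most likely route for a journey between A and B.
    `candidates` is a list of (route_id, duration_minutes) pairs;
    `history` maps route_id -> number of past journeys on that route.
    Routes the user has taken often outweigh small speed differences;
    with no history, the fastest journey wins."""
    def score(route):
        route_id, duration = route
        # The weight of 10 minutes per past journey is illustrative.
        return history.get(route_id, 0) * 10 - duration
    return max(candidates, key=score)[0]

routes = [("bus-53", 40), ("train-SE", 25)]
print(choose_route(routes, {}))             # no history: fastest wins
print(choose_route(routes, {"bus-53": 3}))  # habitual route wins
```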
We came to the conclusion that by combining the app with a query of the Transport API we could reproduce a user's journey on any transport type in the UK with a high level of accuracy. We hope to explore this further in a future iteration of the app and to integrate some gameplay into the mix. Watch this space as this develops during 2014.
Earlier in the year, we'd proposed a "perceptive" experience for the CBBC homepage, taking in data feeds to change how it was presented and incorporating playful elements (eg blowing away clouds if it was a rainy day near you), so Perceptive Media was a natural fit.
Breaking Out, BBC R&D's original Perceptive Media experiment, is a two-character radio play – one character voiced by a text-to-speech synthesiser – which uses data to adapt the content to the listener's context, eg using location data to fill in dialogue gaps such as the weather or place names ("you can do anything, you can go to the Imperial War Museum" – if you're in Salford/Manchester).
One of the main issues BBC R&D faced when testing the Perceptive Media radio play was that audiences could not separate the web-based content from the shiny screen – users would wait for something to happen on screen rather than listening to the audio content.
Ian wanted to free the content from the screen, to see how audiences really reacted to Perceptive Media.
The possibilities of networked things and Perceptive Media are vast, opening up content to be 'remixed' live or contextualised for an audience's environment and context.
Ian's brief to us was to create a domestic radio-style object that would play back the radio play and pull in data to adapt it to the user's context. This would enable audiences to act naturally around the content, and BBC R&D to understand more about Perceptive Media and audience behaviour.
We decided that the object itself would provide further data feeds – using analogue electronics and sensors – in addition to the web-based data already being collected.
The Domestic Environment
Influenced by networked objects that fit seamlessly into the domestic environment and serve a natural function – such as Voy's Ugle, Russell's Bikemap, the Good Night Lamp and Skrekkøgle's own radio object, Plugg – we wanted to ensure that the object was part of the home, not an intervention. The Perceptive Radio is a behaviourally-driven design.
To do this, we spent time role-playing, listening to the radio, and documenting how we behaved around it – as well as what else was going on around the home. Three key scenarios played out most frequently:
Pottering and moving about in one room (eg making a cup of tea, walking to and from cupboards)
Pottering and moving in and out of many rooms (eg housework, gardening)
Settling down to properly listen to the radio (eg armchair with a cup of tea and pet)
The first two scenarios point at partial attention – with radio as an accompaniment – whilst the third is a clear, focused attention scenario.
The environmental/behavioural elements documented most frequently were:
Time of day
Spikes in ambient noise (eg a telephone call, washing machine spin cycle)
Designing The Perceptive Radio
The design challenge was to demonstrate how a networked object could deliver tailored media experiences that are sympathetic to domestic environments, without being disruptive or jarring.
Marrying the contexts with environmental behaviours was the next stage in domestic design:
What would a user want a radio to do when they are moving around a room frequently?
What would a user want a radio to do when someone calls them up?
What would a user want a radio to do when they are settling down to relax?
In order to avoid any unnecessary diversions or wrong routes, we settled on four key design rules:
Behaviour of the audience as control input
The Perceptive Radio is to be designed as an object that is sympathetic to the domestic environment – something that sits comfortably and naturally in a home – and the behaviour of the listener affects the output. Passive, not active, inputs.
No physical interactions
Whilst the box itself has many affordances that could be used as sensor inputs – such as orientation, angle and touch – it was key that the object itself would be secondary. A function, rather than a tool. So, beyond the obvious controls (on/off, play, master volume), the user's behaviour and environment would be the only control inputs.
Meaningful controls
As the user is the control input, the controls have to be meaningful. It is important that those controls are behaviours that already exist, and that we do not attempt to create a new invisible control system. We focused on ambient sound and natural movement rather than forced gestures.
Believable effects
For the Perceptive Radio to fit comfortably into a domestic environment, the output (effects) from the interactions must be believable. There are many variables that could be mapped to interactions – such as pitch or dialogue pace – that would jar the listening experience rather than being sympathetic to the audience. All effects must be believable and beneficial to the user.
Four Design Scenarios
1. Pottering and moving about in one room: making a cup of tea.
Change in volume.
In this scenario, the user is moving around the kitchen, boiling the kettle, collecting mugs and milk whilst they listen to the radio. As they move away from the radio, the volume adjusts so that the user hears a consistent level at all times.
2. Pottering and moving about in one room: large spike in ambient noise.
Altering the depth of the audio playback.
In this scenario, the user is moving around the kitchen and filling up the dishwasher whilst they listen to the radio. As the dishwasher kicks in and there is a large spike in ambient noise, the radio alters the depth of the audio playback to foreground the most important element (eg the actors' voices).
3. Settled down to properly listen to the radio: lowered light levels.
Adjusting the EQ levels.
In this scenario, the user is settling down to a cosy evening listening to the radio. They stoke their fire and turn off bright lights in the room. The change in light levels in the room causes the radio to adjust the EQ levels, cutting most of the treble, to make a more relaxing listening experience.
4. Settled down to properly listen to the radio: large spike in ambient noise, movement.
Pause the playback.
In this scenario, the user is settled down to a cosy evening listening to the radio before being interrupted by a phone call. The large spike in noise, coupled with the user's movement away from the radio, causes the radio to pause playback of the audio.
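Taken together, the four scenarios amount to a small set of sensor-to-effect rules. A minimal sketch of that mapping – with entirely illustrative thresholds, units and effect names, none of which come from the actual radio:

```python
def radio_effects(distance_m, ambient_db, light_lux, user_moving):
    """Map sensor readings to playback effects, one rule per scenario.
    All thresholds are placeholder values for illustration."""
    if ambient_db > 75 and user_moving:
        return ["pause"]                     # scenario 4: phone call
    effects = []
    if ambient_db > 75:
        effects.append("foreground_voices")  # scenario 2: dishwasher
    if light_lux < 50:
        effects.append("cut_treble")         # scenario 3: cosy evening
    if distance_m > 2.0:
        effects.append("boost_volume")       # scenario 1: pottering
    return effects

# Noisy kitchen, listener across the room, lights on:
print(radio_effects(3.0, 80, 200, user_moving=False))
```

Note that scenario 4 short-circuits the others: once the listener has walked away to answer the phone, there is no point compensating the volume or EQ.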
The Technical Approach
As an object, the Perceptive Radio is a bit of a trick on the user – it looks dumb, but is pretty powerful. The broad remit for the technical and object-design side was that it needed to be “small enough to fit in a box, fast enough to use Web Audio API” whilst processing data feeds from analogue electronics.
Working with Adrian McEwen from MCQN Ltd in Liverpool to successfully put the internet into a thing, our first hope for running Breaking Out was a smartphone working with a Raspberry Pi and Arduino. Unfortunately, the Perceptive Media radio play does quite a lot of heavy lifting – including Web Audio API work and real-time object-based processing (putting a load of reverb on the text-to-speech synth) – which requires a good-quality browser such as Chrome and a reasonably powerful computer.
This kind of processing power is sadly not available in small things, and – after testing Breaking Out on Android/iOS mobile devices, netbooks, Chromebooks and an array of Raspberry Pis – we decided to build a compact Mini-ITX system (a 7″ PC) to power the Perceptive Radio. Using a BeagleBone board as the analogue electronics hub, we had a slightly larger-than-desired technical solution with enough power to process Breaking Out and the analogue inputs comfortably.
The radio object itself needed to accommodate a 7″ PC system and sensitive analogue sensors, have capacity for further sensors/functions later down the line and – crucially – look natural in a home. Early thoughts around the box design included a classic boombox-style affair, a tall cathedral design and a plain box.
All had potential, but were never quite right – all either too bulky, odd or anachronistic. We brought Patrick Fenner on board – an “open source engineer” who makes 2D things 3D – to help “make the box”.
Patrick's experience and speciality in rapid prototyping with lasercut materials took the direction towards a more digital-radio design, using finger-jointed edges, birch plywood and an acrylic fascia:
The chrome suitcase corners and carry handle provide extra structural support and enable Ian to carry the box comfortably on his travels to demonstrate it.
Our Creative Director Matt had an open studio event at his artist studio.
He demonstrated an experimental work of sculptural shapes that inflate and deflate depending on the sound levels in the room. It went down well with the kids who visited the exhibition. We’ve never seen so many children shouting at inanimate objects.
On a particularly hot summer's day in July of last year, Active Ingredient (Mudlark's sister arts company) piloted a new game that we are working on – Exploding Places.
It uses location and time as a means to drive gameplay. We were asked initially to develop a game or experience that made it possible for people to experience the history and culture of a place in a new and innovative way.
Stream Arts, an organisation based in Greenwich with a successful history of exploring new uses of technology to engage local communities in arts-based activities, asked us to create something that would engage people in Woolwich.
We wanted to create something about the locations of Woolwich, as it has many interesting historical features. The Arsenal (both the football team and the armoury) was based there, and a large military barracks still is, so the town has a long association with the soft and hard machines of war (soldiers and weapons). However, we didn't want to create a predictable history tour guide experience – yet another GPS tour-guide application. Instead we wanted the player to game history in a multiplayer environment.
We took a hundred years of Woolwich history and placed key events on a timeline. We decided the game would happen for an hour. Following this logic, a hundred years would play out over the hour – as if hurtling through time.
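That time compression is a simple linear mapping from play time to historical year. A sketch, assuming an illustrative start year of 1900 (the post doesn't state the exact range of the timeline):

```python
GAME_MINUTES = 60    # one hour of real play
HISTORY_YEARS = 100  # a hundred years of Woolwich history
START_YEAR = 1900    # illustrative assumption, not from the post

def game_year(elapsed_minutes):
    """Map elapsed play time to the historical year it represents,
    so a century 'hurtles past' over the hour of play."""
    return START_YEAR + int(elapsed_minutes / GAME_MINUTES * HISTORY_YEARS)

print(game_year(30))  # halfway through the hour, halfway through the century
```

Timeline events tagged with a year can then be fired the moment `game_year` reaches them, globally or only for players standing in the right location.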
The player had free rein to roam around Woolwich town for the hour with an Android phone. Some events were global and experienced by every player – the First World War, for instance – while other experiences were assigned to key locations around Woolwich, so only players physically present at that location would experience them. To make it interesting, players create communities which evolve based on events in history, but also as a result of their proximity to other players' communities. Essentially it is like a simplified version of Sid Meier's Civilization, but played in real space and time. Players could literally move their communities to areas of Woolwich that were safer or where they had a better chance of growing in numbers. The basic premise of the game is to have the largest community after an hour of play and 100 years of history.
In order to work out the mechanics, we had to do some quite exhaustive playtesting to make sure the game worked in the way we intended. Many parameters were worked out on paper before we could take them into the game engine.
In our pilot we also used the big screen in Woolwich Square to show the evolving game board, which created an interesting embodiment of the game in the actual streets of Woolwich itself.
The pilot highlighted the public's engagement with the work: we had people aged from 10 to 70 playing, and there was strong evidence that the initial concept was understood and exciting for those involved. The user interface is still fairly rudimentary and there are still issues with connectivity and system stability, but as a proof of concept it was definitely a success.
Pins on a board.
Expect to see a future development of the concept soon. Designed for a wider locale and more people.
Just back from India, where I've been on the discovery phase of Mudlark's latest project, designing four mobile social impact games as part of the Half the Sky multi-platform project.
Half the Sky is already a best-selling book by Nick Kristof and Sheryl WuDunn that tells the stories of women in the developing world – stories by turns shocking and inspiring as these women experience terrible deprivation and brutality and battle against them. The book suggests ways forward, partly through these examples and also by laying out a series of actions and campaigns – political, moral, educational, medical, economic – that can improve not only the lot of women in these societies but the societies themselves.
Half the Sky is now extending out via a PBS special next year, a social action game on Facebook and a lot of work with NGOs, including the mobile games we are designing for , the executive producers of the Half the Sky transmedia material.
I spent a week in and around Delhi with Games For Change and Indian developers and distributors ZMQ, meeting, talking and – most vitally – listening with and to a variety of NGOs, as well as visiting various communities at whom the games might be aimed.
An Indian woman uses her mobile phone.
We are designing for the very "established" Java/J2ME platform, because those are the sorts of phones you will find in these communities. Typically, there's one mobile per family and it stays in the house, like a landline – in effect it is the landline. But everyone has some access to this key piece of technology, and most of them are already playing games on their phones.
The constraints of the platform are also the challenges. The same goes for the issues we want to express and "play" in the games – both during the trip and since my return we have been brainstorming ideas for games that will improve pre-natal health, persuade parents to let their daughters stay in school after the age of ten, improve girls' health so they can stay in education, confront domestic abuse and even suggest the experience of enforced prostitution.
Children using a mobile phone.
We are looking at twitch games, puzzlers, platformers, tower defence, simple simulations… We want great gameplay on a small screen that doesn't require a lot of text and gets the player to think and learn but also, most importantly, engage. Although the subjects are clearly pretty serious, we never want the games to be called "Serious Games". We plan to blog about the project as it develops, so watch this space.