We are approaching the final phase of research into a game called Cold Sun. It is still very much a prototype, but the initial results, in terms of both graphics and gameplay, are very promising.
The initial idea emerged from a drunken conversation with Canadian developer Jesse Blum and Mudlark founding member Rachel Jacobs while working in Brazil in 2012. We were doing some experiments in a farmhouse in the rainforest outside Rio de Janeiro, and the isolation was quite profound: we were a three-hour walk from the nearest road.
We started riffing on a question: what if you woke up alone in a place like this with the rain coming down hard, as it does a lot in this part of Brazil? A lone survivalist, having to piece together what had happened to the world while they were asleep. This is a classic trope beloved of games and movies – the loner who wakes to find themselves in a world changed beyond recognition. But what attracted us was not only this classic narrative structure but also how weather could play a vital part. Our time in the farmhouse was hampered daily by the pattern of heavy weather; sometimes it was so heavy it seemed to affect people's dreams. So we wondered whether we could create a game where real weather data essentially invades the imaginary universe of the game. We also liked the idea of the player being unsure of who they are in the game. Neither male nor female, man nor beast, they must survive in order to find out the truth of who they are.
What we have come up with is a mobile game that takes real-time weather data from a player's location and amplifies it in a game universe where climate change has drastically altered everything. It challenges the player to think adaptively and to notice weather changes in the real world in order to plan their next move in the game.
Our research has taken us down many avenues in exploring how you might build a game like this. We have conducted research with a number of partners, including Lancaster University, Anglia Ruskin University and Candace Howarth, a climate scientist currently working with the Department for the Environment. We have been given insight into stark climate-change futures and have factored some of this thinking into our game world. Topically, this is the same body of work informing recent news reports about the future of the British climate.
We have distilled the research into a game with two modes: dream mode and survival mode. In survival mode the character has awoken to find themselves on the point of death, and must make decisions – in what is essentially a text adventure – to stay alive. Barely able to move, and in their confusion and weakened state, they pass out frequently; at this point the game switches to dream mode. Here the game becomes an infinite runner in which they have to dodge enemies and jump across strange spheres. In both modes, aspects of the gameplay – whether narrative features or game parameters – are influenced by the state of the weather outside the player's window.
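As an illustration only, real weather readings could be amplified into game parameters along these lines. The field names, thresholds and scaling factors below are our assumptions for this sketch, not Cold Sun's actual design:

```javascript
// Hypothetical sketch: amplify real-world weather into game parameters.
// All field names and factors are invented for illustration.
function amplifyWeather({ rainfallMm, windKph, tempC }) {
  return {
    floodLevel: Math.min(10, rainfallMm * 3), // heavy real rain floods the game world
    enemySpeed: 1 + windKph / 20,             // real wind drives dream-mode enemies faster
    staminaDrain: tempC < 0 ? 2 : 1,          // freezing weather saps the survivor quicker
  };
}
```

The same feed could also steer narrative branches in survival mode, not just numeric parameters.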
We are releasing the game to a test audience in the next few months. Alongside rating levels of fun and engagement, they will test whether they feel a heightened awareness of the real weather as a result of playing the game. Bring on the rain!
So 2014 is here. A time to reflect on some of the research projects we have been doing in the margins of 2013.
In June Mudlark received funding from the TSB Feasibility Study fund to look at tracking user travel behaviour with a view to building game logic around it. This was a piece of parallel research designed to complement our previous work on Chromaroma by extending the game beyond the constraints of fixed infrastructure in the form of smart cards and RFID terminals in stations and on buses.
We wanted to explore how a user or player could get an enhanced history of their travel in London and beyond by monitoring their movement and transport habits over time.
We were aware of the success of the Moves app and the effortless way it tracks different modes of travel – walking, cycling, running and transport – with a near miraculous level of accuracy. We wanted to go a step further and see if we could create a more granular level of detail than is provided by Moves' catch-all "transport" category. We believed that by developing our own system we could tease out the fine distinction between train, underground, bus and car.
Working with the School of Electronic Engineering and Computer Science at Queen Mary University of London, we made this happen by combining sensors and data available on current smartphones – GPS and the accelerometer. We developed an Android app based on an algorithm that measures movement patterns and corrects classification errors by analysing a user's movement over small segments of time.
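We haven't published the algorithm itself, but the error-correction idea described above can be sketched as a majority vote over short sliding windows of raw per-second mode guesses. The window size and mode labels here are illustrative only:

```javascript
// Hypothetical sketch: smooth a stream of raw per-segment mode guesses
// (derived from GPS speed and accelerometer features) by taking a
// majority vote over a short sliding window, correcting one-off
// misclassifications without hiding genuine mode changes.
function smoothModes(rawModes, windowSize = 5) {
  return rawModes.map((_, i) => {
    const start = Math.max(0, i - Math.floor(windowSize / 2));
    const window = rawModes.slice(start, start + windowSize);
    const counts = {};
    for (const mode of window) counts[mode] = (counts[mode] || 0) + 1;
    // Return the most frequent mode in the window.
    return Object.keys(counts).reduce((a, b) => (counts[b] > counts[a] ? b : a));
  });
}
```

A lone "train" guess sandwiched between "bus" readings, for example, would be voted back to "bus".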
Our feasibility test around East London in December, with five test subjects using bus, train, underground and cycle, correctly identified the user's transport mode shifts with an overall accuracy of 78.5%. This was an amazing outcome for something that was essentially working only with the data generated by the phone from the testers' movements.
We also wanted to augment the sensor algorithm with online look-ups of live transport data. To test this aspect we did some quick development using both the Moves API and the Transport API (a service which provides user routing on the UK's transport networks). We took my Moves data, generated on my iPhone, and ran queries on the start and end points of journeys against the Transport API's routing service. This produced remarkably accurate predictions of journey type, down to the bus route or train line.
We ran across some issues with it falsely choosing bus instead of train and vice versa. We discovered that accuracy increased if we chose the fastest journey from the list of possible routes between A and B. This will obviously not always be the right choice: a user may prefer slower journeys, so a future addition would be a method to weight likely routes based on the user's travel history and how often they have taken each route in the past.
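A minimal sketch of that heuristic: prefer the fastest route, but let repeated past use of a route outweigh raw speed. The route fields and history weighting below are our assumptions for illustration, not the Transport API's real schema:

```javascript
// Hypothetical route ranking: fastest wins by default, but a route the
// user has taken before gets a large bonus per past trip, so habitual
// (slower) routes can beat the planner's fastest suggestion.
function pickLikelyRoute(routes, historyCounts = {}) {
  const score = (route) =>
    (historyCounts[route.id] || 0) * 100 - route.durationMinutes;
  return routes.reduce((best, r) => (score(r) > score(best) ? r : best));
}
```

With an empty history this reduces to "pick the fastest", which is the behaviour we tested in December.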
We came to the conclusion that by combining the app with a query of the Transport API we could reproduce a user's journey on any transport type in the UK with a high level of accuracy. We hope to explore this further in a future iteration of the app and also integrate some gameplay into the mix. Watch this space as this develops during 2014.
INDEX: Award is a prestigious biennial design award, set up in 2005 by INDEX: Design To Improve Life to acknowledge and reward companies developing creative solutions to global and local challenges. Spread across five categories – Body, Home, Work, Play, and Community – previous winners include groundbreaking social housing projects, One Laptop Per Child, an airbag for cyclists and the Tesla Roadster.
Mudlark worked tirelessly to design the three games – Worm Attack, 9 Minutes and Family Values – in a way which would have a genuinely positive effect in India and East Africa, especially amongst women and girls. We are particularly pleased that our work, blending social impact content with compelling gameplay, has been recognised in the Play & Learning category.
Downloaded over 25,000 times to date, the Half The Sky Movement games were designed for target audiences on non-smart, low-end, widely-available mobile handsets and built in J2ME – a platform for older “feature phones”, still dominant in the developing world.
Winners of the INDEX: Awards will be announced on 29th August at the home of Hamlet, Kronborg Castle, Helsingør (Elsinore), Denmark before a public exhibition opens across Copenhagen on the 30th August.
We will be sporting our black ties and crossed fingers.
Each year Mudlark hosts Playful, our one-day conference about games, play, design, interaction and behaviour. Each year we give the conference a new look to reflect that year's theme, keeping things new and exciting.
Last year we focused on DIY, attempting to inspire people to do things themselves, and so the design was very texture-heavy – influenced by the photocopied sleeves of punk 7-inch singles and worn paint. This year, looking at the things we were interested in – the trends that were bubbling up and the work Mudlark have been doing – we soon realised that the nature of things was central.
We settled on “playing with form” as a broad theme, to look at approaches, materials, and using them differently—creatively and playfully. Changing direction from last year, we flipped the design on its head by employing strong colour, geometry and angles peppered with some cutting-edge browser technologies.
Design in its broadest terms is central to everything we do at Mudlark, from designing the interactions and behaviours around our latest project, the Perceptive Radio, to the wordmark laser-etched into its fascia. From the process that takes a user from A to B in the best and most enjoyable way, as in our work for Ravensbourne university, to social impact game design with Half the Sky.
In a way the design for Playful is a very selfish exercise—one that lets me design to my own brief, to try things or techniques that maybe aren’t quite ready for client consumption. It’s a way of flexing my creative muscles and, really, to show off.
Playing with Form
Greg and I set up a Pinterest board to gather and share visual inspiration and to generate a direction. As things began to develop we saw some fun coming out of using illusions. We liked the idea of messing with people almost to the point of annoyance in a fun way.
Some early experiments explored pure typography and layout, whilst others followed a more rigid and straightforward style with some animated elements; but overall they came across as a bit too sedate for my own 'visuals that fuck with people' remit. I had to get back on track and shake things up a bit.
I was inspired by Siggi Eggertsson when thinking about visual ways of showing form. Eggertsson uses his own geometric grid to create wonderful designs and in some pieces uses contrast to create form between the segments. Initially I thought this could be a route to explore—using subtle contrast and shadows between shapes or images to show form.
A visual trick I eventually developed – the moiré background pattern – became the showpiece of the final design. Using this method I applied different colour combinations and pattern sizes to different sections of the site, and set some of the text at the same obtuse angles, making for a playful and unusual layout.
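For the curious, the moiré effect itself is well-understood maths: overlaying two stripe patterns of pitch p1 and p2, rotated relative to each other by an angle theta, produces interference fringes with a predictable spacing. This is background maths, not code from the Playful site itself:

```javascript
// Standard moiré fringe-spacing formula for two line gratings:
//   d = p1 * p2 / sqrt(p1^2 + p2^2 - 2 * p1 * p2 * cos(theta))
// where p1, p2 are the stripe pitches and theta the rotation between them.
function moireFringeSpacing(p1, p2, thetaRadians) {
  return (p1 * p2) /
    Math.sqrt(p1 ** 2 + p2 ** 2 - 2 * p1 * p2 * Math.cos(thetaRadians));
}
```

Small differences in pitch or angle give very large fringes, which is why tiny tweaks to the background pattern change the effect so dramatically.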
This year's design has had a very favourable reaction on Twitter – a sign that it has made an impression on people. We pat ourselves on the back at this, knowing we've done what we set out to do, but also knowing that we have to top it next year.
At Mudlark we’re in our element when we get to push the envelope with projects. If you have a design problem then we’d love to hear from you.
Playful 2013 will take place at Conway Hall in London on the 25th of October.
Earlier in the year, we'd proposed a "perceptive" experience for the CBBC homepage, taking in data feeds to change how it was presented and incorporating playful elements (e.g. blowing away clouds if it was a rainy day near you), so Perceptive Media was a natural fit.
Breaking Out, BBC R&D's original Perceptive Media experiment, involved a two-character radioplay – one character acted by a text-to-speech synthesiser – which used data to adapt the content to the listener's context, e.g. using location data to fill in dialogue gaps such as weather or place names ("you can do anything, you can go to the Imperial War Museum" – if you're in Salford/Manchester).
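The gap-filling idea can be sketched as simple template substitution. The placeholder syntax and context fields here are invented for illustration; the real system is considerably more sophisticated:

```javascript
// Illustrative only: complete a line of dialogue from the listener's
// context. Placeholders like {landmark} are filled if the context has
// a value for them, otherwise left intact.
function fillDialogue(template, context) {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    context[key] !== undefined ? context[key] : match
  );
}
```

In practice the location data would come from the browser, and the filled line would be spoken by the synthesised character.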
One of the main issues BBC R&D faced when they tested the Perceptive Media radioplay was that audiences could not separate the web-based content from the shiny screen – users would wait for something to happen on screen, rather than listening to the audio content.
Ian wanted to free the content from the screen, to see how audiences really reacted to Perceptive Media.
The possibilities of networked things and Perceptive Media are vast, opening up content to be 'remixed' live or contextualised for an audience's environment and context.
Ian's brief to us was to create a domestic radio-style object that would play back the radioplay and pull in data to adapt it to the user's context. This would enable audiences to act naturally around the radioplay content, and allow BBC R&D to understand more about Perceptive Media and audience behaviour.
We decided that the object itself would provide further data feeds – using analogue electronics and sensors – in addition to the web-based data already being collected.
The Domestic Environment
Influenced by networked objects that fit seamlessly into the domestic environment and serve a natural function – such as Voy's Ugle, Russell's Bikemap, the Good Night Lamp and Skrekkøgle's own radio object, Plugg – we wanted to ensure that the object was a part of the home, not an intervention. The Perceptive Radio is a behaviourally-driven design.
To do this, we spent time role-playing, listening to the radio, and documenting how we behaved around it – as well as what else was going on around the home. There were three key scenarios that most frequently played out:
Pottering and moving about in one room (eg making a cup of tea, walking to and from cupboards)
Pottering and moving in and out of many rooms (eg housework, gardening)
Settled down to properly listen to the radio (eg armchair with a cup of tea and pet)
The first two scenarios point at partial attention – with radio as an accompaniment – whilst the third is a clear, focused attention scenario.
The environmental/behavioural elements that were most frequently documented were:
Time of day
Spikes in ambient noise (eg a telephone call, washing machine spin cycle)
Designing The Perceptive Radio
The design challenge was to demonstrate how a networked object could deliver tailored media experiences that are sympathetic to domestic environments, without being disruptive or jarring.
Marrying the contexts with environmental behaviours was the next stage in domestic design:
What would a user want a radio to do when they are moving around a room frequently?
What would a user want a radio to do when someone calls them up?
What would a user want a radio to do when they are settling down to relax?
In order to avoid any unnecessary diversions or wrong routes, we settled on four key design rules:
Behaviour of the audience as control input
The Perceptive Radio is to be designed as an object that is sympathetic to the domestic environment – something that sits comfortably and naturally in a home – and the behaviour of the listener affects the output. Passive, not active, inputs.
No physical interactions
Whilst the box itself has many affordances that could be used as sensor inputs – such as orientation, angle, touch – it was key that the object itself would be secondary. A function, rather than a tool. So, beyond obvious controls (on/off, play, master volume), the user's behaviour and environment would be the only control inputs.
Meaningful controls
As the user is the control input, the controls have to be meaningful. It is important that those controls are behaviours that already exist, rather than attempts to create a new invisible control system. We focused on ambient sound and natural movement rather than forced gestures.
Believable effects
For the Perceptive Radio to fit comfortably into a domestic environment, the output (effects) from the interactions must be believable. There are many variables that could be mapped to interactions – such as pitch or dialogue pace – which would be jarring to the listening experience rather than sympathetic to the audience. All effects must be believable and beneficial to the user.
Four Design Scenarios
1. Pottering and moving about in one room: making a cup of tea.
Change in volume.
In this scenario, the user is moving around the kitchen, boiling the kettle, collecting mugs and milk whilst they listen to the radio. As they move away from the radio, the volume adjusts so that the user hears a consistent volume level at all times.
2. Pottering and moving about in one room: large spike in ambient noise.
Altering the depth of the audio playback.
In this scenario, the user is moving around the kitchen and filling up the dishwasher whilst they listen to the radio. As the dishwasher kicks in and there is a large spike in ambient noise, the radio alters the depth of the audio playback to foreground the most important elements (e.g. the actors' voices).
3. Settled down to properly listen to the radio: lowered light levels.
Adjusting the EQ levels.
In this scenario, the user is settling down to a cosy evening listening to the radio. They stoke their fire and turn off bright lights in the room. The change in light levels in the room causes the radio to adjust the EQ levels, cutting most of the treble, to make a more relaxing listening experience.
4. Settled down to properly listen to the radio: large spike in ambient noise, movement.
Pause the playback.
In this scenario, the user has settled down to a cosy evening listening to the radio before being interrupted by a phone call. The large spike in noise, coupled with the user's movement away from the radio, causes the radio to pause playback of the audio.
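The four scenarios can be summarised as a single decision over passive inputs. This is an illustrative sketch only – the field names, thresholds and priority order are our assumptions for this post, not the radio's actual logic:

```javascript
// Hypothetical mapping from passive sensor readings to a playback
// adjustment, covering the four scenarios above in priority order.
function adaptPlayback({ distanceMetres, noiseSpike, lightLevel, moving }) {
  if (noiseSpike && moving) return { action: 'pause' };       // 4: phone call, user walks away
  if (noiseSpike) return { action: 'foregroundVoices' };      // 2: dishwasher drowns the mix
  if (lightLevel < 0.2) return { action: 'cutTreble' };       // 3: cosy low-light listening
  // 1: volume tracks the listener's distance from the radio.
  return { action: 'setGain', gain: Math.min(1, 0.4 + distanceMetres / 10) };
}
```

In the real radio these adjustments would be applied to the Web Audio graph driving the radioplay, rather than returned as plain objects.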
The Technical Approach
As an object, the Perceptive Radio is a bit of a trick on the user – it looks dumb, but is pretty powerful. The broad remit for the technical and object-design side was that it needed to be “small enough to fit in a box, fast enough to use Web Audio API” whilst processing data feeds from analogue electronics.
Working with Adrian McEwen from MCQN Ltd in Liverpool to put the internet into a thing, our first hope for running Breaking Out was a smartphone working with a Raspberry Pi and an Arduino. Unfortunately, the Perceptive Media radioplay does quite a lot of heavy lifting – including using the Web Audio API and performing real-time object-based processing (putting a load of reverb on the text-to-speech synth) – which requires a good-quality browser such as Chrome and a reasonably powerful computer.
This kind of processing power is sadly not available in small things, and – after testing Breaking Out on Android/iOS mobile devices, netbooks, Chromebooks and an array of Raspberry Pis – we decided to build a compact Mini-ITX system (a 7″ PC) to power the Perceptive Radio. Using a BeagleBone board as the hub for the analogue electronics, we had a slightly larger-than-desired technical solution with enough power to process Breaking Out and the analogue inputs comfortably.
The radio object itself needed to accommodate a 7″ PC system and sensitive analogue electronics sensors, have the capacity to add further sensors and functions later down the line and – crucially – look natural in a home. Early thoughts around the box design included a classic boombox-style affair, a tall cathedral design and a plain box.
All had potential, but were never quite right – all either too bulky, odd or anachronistic. We brought Patrick Fenner on board – an “open source engineer” who makes 2D things 3D – to help “make the box”.
Patrick's experience and speciality in rapid prototyping with laser-cut materials took the direction towards a more digital-radio design, using finger-jointed edges, birch plywood and an acrylic fascia.
The chrome suitcase corners and carry handle are there for extra structural support and to enable Ian to carry the box comfortably on his travels to demonstrate it.