Archive: Future

Designing Cold Sun

An exploration into a strange future: narrative, weather and consciousness.

Cold Sun is a dual-mode adventure game that is affected by real-time weather. In the game you flip between your Existence, which is set in the future, and your Dream world, a one-touch platform game where you must navigate a magnified environment generated from real weather data. Matt describes the state of this research project so far.
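By way of illustration only – the weather endpoint, field names and mappings below are invented for this post, not the game's actual code – the plumbing for a weather-driven Dream level might look something like this minimal Python sketch:

```python
import requests

# Illustrative only: a placeholder weather endpoint and field names, not the
# service or schema Cold Sun actually uses.
WEATHER_URL = "https://api.example-weather.com/current"

def fetch_weather(lat, lon):
    """Fetch a current-conditions snapshot for the player's location."""
    resp = requests.get(WEATHER_URL, params={"lat": lat, "lon": lon}, timeout=5)
    resp.raise_for_status()
    return resp.json()  # assumed shape: {"wind_speed": 7.2, "cloud_cover": 0.8, "temp_c": 3.1}

def dream_level_params(weather):
    """Map weather readings onto Dream-mode level parameters (an invented mapping)."""
    return {
        "scroll_speed": 1.0 + weather["wind_speed"] / 10.0,  # windier day, faster level
        "visibility": 1.0 - weather["cloud_cover"],           # cloudier day, murkier level
        "palette": "cold" if weather["temp_c"] < 5 else "warm",
    }

params = dream_level_params(fetch_weather(51.5, -0.12))
```

The real interest, of course, is in what those parameters do to the feel of the level – which is where the visual work below comes in.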

Some themes we explored for this stage one prototype included:

· Coils. Mortal coils, electric coils, etc.
· Circles as a general theme. Sun, timers, time running out, the globe.
· Colours and their variation in weather patterns.
· Dreaminess. Blurry. Disorientation, spirals.

After poring over images we pulled together using Pinterest (a really useful collaborative moodboarding tool we love to use), we started to generate some of our own visuals.

Spiral experimentations.

Wind denotations.

A cold sun.

We then started to explore how these visual ideas could work as a title for the game.

A playful option for Cold Sun.

Cold Sun using ‘weather front’ sign.

We arrived at a really atmospheric graphic that we could use for a title screen, some Dream mode disorientating circular gameplay, and more.

Final Cold Sun title screen.

Dream mode gameplay.

Mudlark creates games, from small mobile and desktop games through to real-world experiences based on your heartbeat or games the size of London. Game design and playful thinking are central to Mudlark’s creative processes. Take a look at other games we have made.

Going Off the Rails

So 2014 is here. A time to reflect on some of the research projects we have been doing in the margins of 2013.

In June Mudlark received funding from the TSB Feasibility Study fund to look at tracking user travel behaviour with a view to building game logic around it. This was a bit of parallel research designed to complement our previous work on Chromaroma by extending the game beyond the constraints of fixed infrastructure in the form of smart cards and RFID terminals in stations and on buses.

We wanted to explore how a user or player could get an enhanced history of their travel in London and beyond by monitoring their movement and transport habits over time.

We were aware of the success of the Moves app and the effortless way it tracks different modes of travel – walking, cycling, running and transport – with a near-miraculous level of accuracy. We wanted to go a step further and see if we could create a more granular level of detail than is provided by Moves’ catch-all “transport” type. We believed that by developing our own system we could tease out the finer distinctions between train, underground, bus and car.

Working with the School of Electronic Engineering and Computer Science at Queen Mary University, we made this happen by combining the sensors and data available on current smartphones – GPS and accelerometer. We developed an Android app based on a unique algorithm that measures movement patterns and corrects errors in classification by analysing a user’s movement over small segments of time.
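As a rough sketch of that error-correction idea (not the actual Queen Mary implementation – the labels, window size and pre-labelled input here are placeholders), a sliding majority vote over the raw per-segment classifications is enough to smooth away one-off mistakes:

```python
from collections import Counter

# Placeholder transport modes; the real classifier distinguishes these from
# GPS speed and accelerometer features, which are not reproduced here.
MODES = ["walk", "cycle", "bus", "train", "underground", "car"]

def smooth_labels(raw_labels, window=5):
    """Correct isolated misclassifications by majority-voting over a sliding
    window of neighbouring time segments (window size is an assumption)."""
    half = window // 2
    smoothed = []
    for i in range(len(raw_labels)):
        neighbourhood = raw_labels[max(0, i - half): i + half + 1]
        smoothed.append(Counter(neighbourhood).most_common(1)[0][0])
    return smoothed

# A stray "car" segment inside a bus journey gets voted away:
print(smooth_labels(["bus", "bus", "car", "bus", "bus", "walk", "walk", "walk", "walk"]))
# -> ['bus', 'bus', 'bus', 'bus', 'bus', 'walk', 'walk', 'walk', 'walk']
```

The actual algorithm obviously works on live sensor features rather than a pre-labelled list, but the smoothing principle is the one described above.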


User Interface of Test Android App

Our feasibility test around East London in December, with five test subjects using bus, train, underground and cycle, correctly identified the user’s transport mode shifts with an overall accuracy of 78.5%. This was an amazing outcome for something that was working only with the data generated by the phone from the testers’ movements.

We also wanted to augment the sensor algorithm with online look-ups of live transport data. To test this aspect we built a quick prototype that used both the Moves API and the Transport API (a service which provides user routing on the UK’s transport networks). We took Moves data generated on my iPhone and then ran queries on the start and end points of journeys against the Transport API’s routing service. This produced remarkably accurate predictions of journey type, down to the bus route or train line.
We ran across some issues with it falsely choosing bus instead of train and vice versa. We discovered that accuracy improved when we picked the fastest journey from any list of possible routes between A and B. This heuristic obviously won’t always hold – a user may choose a slower journey – so a future addition would be a method for tracking likely routes based on a user’s past history of travel and how often they have travelled that route before.
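Roughly speaking, the look-up amounts to asking a journey planner for routes between a journey’s detected start and end points and taking the quickest one as the predicted journey type. A minimal sketch, with an invented endpoint and response shape standing in for the real Transport API:

```python
import requests

# Invented routing endpoint and response fields -- illustrative only, not the
# actual Transport API schema.
ROUTING_URL = "https://journeys.example.com/plan"

def predict_journey_type(origin, destination):
    """Ask for possible routes between a journey's detected start and end points
    (lat/lon pairs) and return the mode and line of the fastest option."""
    resp = requests.get(
        ROUTING_URL,
        params={
            "from": f"{origin[0]},{origin[1]}",
            "to": f"{destination[0]},{destination[1]}",
        },
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed shape: [{"mode": "bus", "line": "38", "duration_mins": 27}, ...]
    routes = resp.json()["routes"]
    fastest = min(routes, key=lambda r: r["duration_mins"])
    return fastest["mode"], fastest.get("line")

# e.g. predict_journey_type((51.546, -0.104), (51.503, -0.112))
```

A user-history weighting of the kind described above would simply re-rank the candidate routes before that final step, rather than always trusting the fastest option.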


Interface showing “Moves” journeys and the journey type predictions

We came to the conclusion that by combining the app with a query of the Transport API we could reproduce a user’s journey on any transport type in the UK with a high level of accuracy. We hope to explore this further in a future iteration of the app, and to work some gameplay into the mix. Watch this space as this develops during 2014.

Month Links: February 2013

Hello, March – it’s really good to see you. It’s been a while. It feels like I hardly got to know February, before it spirited away for another year.

Still, in those brief twenty-eight days, we kicked off three new pieces of work as well as continued work on a good half-dozen live projects. Good, busy days with some exciting things on the go. In the quieter moments, here’s a bunch of things we found interesting and that provided happy diversions.

Olafur Eliasson, Model For A Timeless Garden.

- The Light Show at Hayward Gallery is a marvel, and really worth sticking your head in. Even if just for the Eliasson piece above.

- One of the best things so far this year is one of the smallest and simplest: Jargone. Jargone is a bookmarklet that scans websites for jargon language and suggests common, day-to-day alternatives. It’s made by Roo Reynolds and is an excellent by-product of the dedication to simple, clear, quality work being done within Government Digital Service.

- Continuing the “doing simple well” thread, James has gone back to Twitter’s post from last summer about their process in overhauling their mobile site. It’s easy for us to advocate mobile-first practices, but how do you go about that when you have half a billion users and thousands of devices to serve across the world?

- We”ve been enjoying the open epistolography of Hubbub”s Recess! project – a published discourse around games between Kars, Alper and Niels.

Asshole Mario 3, Stage 1.

- Die Gute Fabrik’s Doug did a “best games of 2012” end of year post. Normally you’d expect a top ten of indie, AAA and folk games, but Doug’s list is a brilliant snapshot of 2012 – specific moments of play that stuck out. From a trampoline-controlled mod of Proteus to competitive yoga and the Hokra “world championships”. All incredibly exciting and envy-creating.

- Our thoughts have turned to 2013: what it is, what it will look like, who we’d like to have speak, things we’d like to hear more about. It’s an exciting bit of the project, the first flushes of romance before the realisation that oh god 400-odd people are expecting a good time. As ever, we’ll be looking for interesting ideas and cold hard cash for sponsorship – so get in touch if you have either of those things.

- A few times I’ve caught James making some odd movements in the corner of my eye; he has been playing with the Responsive Typography demo by Marko Dugonjić. It’s an interesting project, and feels like it touches on the ideas about Perceptive Media, not just a straight up “responsive” approach.

- In other face-tracking news, the brilliant Henry Cooke has created Faces In The Cloud – a thing mixing computer vision and humans’ tendencies to pareidolia.

Sruli Recht A/W 13, Runway Presentation.

It”s been an excellent month for apocalypse fans, the best since December. I read a very cold, but beautiful, collection of graphic short stories recently published by Fantagraphics, Beta Testing The Apocalypse. It”s part Ballard, part design fiction, part straight up comics. Never seen architecture used so well as a character in comics.

- Channel 4 have put out plenty of paranoid drama lately, in the form of Utopia (eugenicists, preppers & conspiracy theorists) and Black Mirror’s pop-apocalypse of glowing rectangles.

- Utopia led me to this excellent article in the NYT about TEOTWAWKI (“the end of the world as we know it”) and the prepper scene in New York. Particularly interesting in the post-Sandy context.

- Black Mirror (for all of its many, many, many failings) has provoked a few discussions in the studio. One of particular interest is its approach to interaction design, which seems at times insightful (who doesn’t want the curved digital drawing board?) and sloppy (the mixed metaphors of tactile and gestural interactions clearly come from a Surface Tablet user).

- Black Mirror is interesting in terms of how non-designers are designing interactions that are eventually adopted. That has seen us revisiting the excellent post by Einar about wifi in Sherlock, an interesting read about how Minority Report has locked people into bad IxD, obligatory Dan Hill post about world building, as well as this wonderful blog looking at HUDs and GUIs in film/games. All of which is very helpful for unnamed project #2.

AND FINALLY

To advance the cause of the world, Al Gore wants you to spam climate change deniers.

- Things that have been in our ears: , , .

- A great reason to get a Little Printer from BERG.

- A blogging platform designed for transience.

- Richard took the black, in his “finest” Sean Bean accent.

- An archive for a classic of Modernist design, Vignelli’s NYC TA Standards Manual.

- A “mood data sculpture” that waters (or not) a Rose of Jericho based on scraped feelings.

- Machine manuals out-emo tumblr users.

- A nice bit of IoT that processes a lot of complex data to let you know the best route to work.

- Russell”s back in café”s, revisiting some of his greatest hits.

Secret Robot House: a trip into the heart of the uncanny valley.

Last Wednesday, I attended an evening at the ‘Secret Robot House’. I’m not sure what images that conjures up for you — an underground laboratory? White coats and flying sparks? Heath Robinson‘s spare room? A house made of robots, for robots, by robots? I wasn’t really sure what to expect. It was all in the spirit of the thing to not really know.

“A minibus will collect you from Hatfield station at 6:30 and escort you to the Secret Robot House”, was the only information from Alex (Internet of Things luminary and organiser of the evening). The minibus took us through a comfortable suburban estate and pulled up outside a residential house. A regular-looking semi-detached in a sleepy middle-class area of Hatfield. Alex waved us in and we entered.

Robot 1

It’s a real house, but not a real house. The first things I notice are cut-out spatial markers on the ceiling, dozens throughout the room. A desk covered in computers. Then a pair of robots by the bookshelf. One looks like an oversized Nespresso maker, the other like Wall-E’s slightly junkier sibling. Everyone shuffles in an amused, but slightly confused, way. It’s a real house, but not a real house. It’s a robot’s home, a research space for domestic robots to understand human-robot relationships. We’re walking in the uncanny valley.

Nicolas Nova presented a short talk that took us through the role of sci-fi in developing robots; robots have been fictions from the very beginning, fictions that people want to be real. We keep trying to make science fact out of science fiction. Nova talks so simply that it’s easy to imagine he’s not saying anything at all, but – and this is the main point of the whole evening – he demonstrated how hard it is to design robots as robots. They will always be an expression of the designers’ beliefs, culture and understanding of others. Nova led us around the various ways in which people respond to robots and the ways that people don’t want to interact with robots (if they are humanoid, or animal-like; slide 11). It was an interesting wander over the developing rules of engagement of Human-Robot Interaction: what we are willing to interact with and what we are rationally uncomfortable about. Too human and we fear Nexus 6 Replicants; too much personality and we fear HAL 9000. And nobody really wants an internet fridge.

In response to Nova, Patrick Bergel suggested that robotics research owes more to old magic than it does to sci-fi. He pointed towards the Sorcerer’s Apprentice as the defining example, an original manifestation of the desire for self-automated helpers to do our work for us. The brooms are robots in disguise. It’s true, inasmuch as the Golem is a clay robot, but it’s a little sophistic. We generally try to understand our futures by extrapolating from the things that exist in our present. Magic and sci-fi are the same, except that magic is now electronic and cybernetic. Perhaps, if Walt Disney had been born thirty years later, the brooms would have been bots with screens.

Prof. Kerstin Dautenhahn (founder of the Adaptive Systems Group responsible for the robotics research in the house) gave us a deeper backstory to the research and the role of the house. She offered a better understanding of the limitations of robots and the perennial challenges that face the researchers. The research was about developing robotic companions — for the elderly, disabled or house-bound — to support and complete simple tasks. It is less about having fully functioning automatons that do everything, without input. Their challenge is to design and build Companions that are ‘useful’ and ‘social’, but also adhere to current and future standards of robotics.

An interesting point Dautenhahn made was about robotics (particularly the robotics research they are doing) being a synthetic science. It is a mixture of engineering, psychology and sociology. The robots aren’t a species, and are inconsistent. They change regularly with each new design/build. Intelligence is redefined with each iteration, emotionally and in processing ability. A robot of today is not the robot of the future, today. What the ASG are trying to do is develop a kind of social language, the laws of interaction between humans and robots beyond Asimov’s Three. How does a human react when a robot does this, and why? What is the best way for a robot to deliver a drink, and how can a robot avoid injuring humans while completing simple tasks?

Robot 2

Coming back to Nova’s talk — citing Kevin Slavin, “it becomes real by behaving real, by demonstrating the behaviour of things that are real” — we keep asking ‘how much of a human personality should you put into a robot?’

Over the course of two demonstrations, it became very clear that the role of human-ness in robotics is important. The Care-o-Bot 3® had two clear facets to its construction: a human-facing front, delivering objects and scanning the room; and a mechanical reverse with a massive arm for task-completion. The Care-o-Bot 3® moves like Mrs. Danvers in Rebecca: eerily smooth and slightly jarring. It is neither wholly robotic nor wholly humanised. We dip into the uncanny valley as its non-human movement sticks out even more than its bleating of “HERE IS YOUR DRINK” to complete a task – communicating in human terms after decidedly non-human movements.

The second demonstration was of the Sunflower robot: a machine that definitely chimes with our expected ideas of a robot. Short and boxy, with arms and a head with ‘eyes’. The eyes aren’t real, but are simply there for human benefit — something for us to focus upon, rather than looking for meaning in the rest of the machine. The Sunflower is the stranger of the two robots. It navigates a space, stops, recalibrates where it is. Occasionally it looks a bit lost and frightened, before it gets where it should be. It nags for attention when the doorbell rings, and is satiated when its master acknowledges it. It is, in the words of Bashford, a bit like Reality Clippy. Unlike Care-o-Bot, it uses flashing light systems to communicate whether it needs attention or a task has been completed.

The other interesting concept at play in Sunflower is the ‘Migration’ mode, which makes your robot portable. The Sunflower system is being designed as an “UbiCompanion”: the robot is not the machine, but the personality users have refined and customised. It is the set of functions that users have tweaked for their personal preferences. In Migration mode, users can transfer the robot’s consciousness, for want of a better word, to other objects (think of transferring a Mii character from your Wii to another console). This makes for a smarter system that jumps from a dumb box on wheels, to Skype, to desktop computers and anything that is connected. It’s a connected thing of the future, and very real.
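As a thought experiment only (this is not LIREC’s actual architecture; the profile fields and devices are invented), the migration idea boils down to treating the companion as a portable bundle of state rather than a particular machine:

```python
import json

# Invented example of a companion "personality" profile -- the learned,
# customised state that defines the robot, rather than its hardware.
companion = {
    "name": "Sunflower",
    "owner": "resident-01",
    "preferences": {"alert_style": "soft-light", "greeting": "chime"},
    "learned_routines": ["doorbell -> notify owner", "kettle on -> wait in kitchen"],
}

def migrate(profile, target_host):
    """Serialise the companion's state so another connected device can embody it;
    in practice this payload would be sent over the network to target_host."""
    payload = json.dumps(profile)
    print(f"Migrating {profile['name']} to {target_host} ({len(payload)} bytes)")
    return payload

# The same 'personality' hops from the wheeled robot to a desktop, Skype, etc.
migrate(companion, "desktop-pc")
```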

Inside the house

The trip to the Secret Robot House was hugely enjoyable, and made for some interesting thoughts about behaviour and future-relationships. Huge thanks to Alex for organising and to LIREC for opening up their dizzying research. More organisations should invite plebs into their labs.

A robot in the house