This project was made with Elisa Spigai for a bodily interaction course. We challenged the paradigm of creating new ways for us to interact with computers, and instead thought about the ways computers use us: from the innocent messages of a computer asking to be plugged into a power source because it is running out of battery, or a printer asking us to change the color cartridge, to the idea of human domination by AI. The seemingly innocent game of trying to escape a decision tree puts the player in the position of choosing against their will. The decisions the user must make in order to escape reflect respect for machines over humans.
We started from the point of view that we may already be dominated by our systems. Contrary to what we believe we are doing when we plug our computer into the power source (providing energy for the computer to serve us), we could actually be following the computer's instructions, obeying a larger social instruction that turns us into gears for the development of more technology, in favour of the computer industry itself. In this view, artificial intelligence is an evil force that has already enslaved us in order to build itself. From this idea, we wanted to develop a game in which, once the user is involved, they become part of a plan whose intention only becomes clear after the decision has been taken, creating a moment of reflection on what has already been done but cannot be undone.

It was very hard to come up with an idea that was doable in a week. We went through ideas such as a larp involving all the information systems in the new media building, in which many small independent actions by different people would converge into a central action, revealing a bigger intention behind previously innocent glitches, emails, prints, etc. These first ideas were too big to be feasible. After a lot of discussion, we settled on an escape-room style game, in which the intention of the machine would be revealed at the end. Within this frame, we moved from making something that would side with the computer, to making the player feel that they might have been brainwashed by this evil machine.
Inspired by The Stanley Parable, we created a decision-tree based game that is the only means of escape in an escape-room context. In this game, the only way to reach the end is to choose what the computer wants the player to choose. The game first presents the user with challenges that seem play oriented, but these gradually turn into strange choice questions, and finally become choices of empathy and support for machines over empathy and support for humans; most clearly in the last question, where the player must choose to feel worse about a robot being pushed around by humans than about a boy being bullied at school. We designed the general shape of the decision tree, and Elisa did an amazing job working out its details.
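The core mechanic described above can be sketched as a tree of questions where only the machine's preferred answer moves the player forward; any other answer leaves them trapped. This is a minimal illustrative sketch, not the actual game's code, and the node names and question text are invented for the example.

```python
# Minimal sketch of a coercive decision-tree node: only the branch the
# machine "wants" leads onward; every other choice keeps the player stuck.
# Node names and question text are illustrative, not from the actual game.

class Node:
    def __init__(self, question, choices, wanted, end=False):
        self.question = question   # text shown to the player
        self.choices = choices     # list of answer labels
        self.wanted = wanted       # index of the answer the machine wants
        self.end = end             # True if reaching this node frees the player
        self.children = {}         # choice index -> next Node

def step(node, choice):
    """Advance only if the player picks the machine's preferred answer."""
    if choice == node.wanted and choice in node.children:
        return node.children[choice]
    return node                    # any other answer changes nothing

# Tiny two-node tree: the player escapes only by siding with the machine.
exit_node = Node("You are free to go.", [], 0, end=True)
root = Node("Who deserves more sympathy?", ["the boy", "the robot"], 1)
root.children[1] = exit_node

current = step(root, 0)      # siding with the human: still trapped
current = step(current, 1)   # siding with the machine: the exit opens
```

The design point the sketch captures is that the tree has no genuine branches: progress exists only along the path the machine has already chosen for the player.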
We feel that, given the initial idea, this game could be much more than what we were able to build in a week; we wanted to integrate many more user actions into the escape, to deepen the sensation of self-betrayal at the end. Building such a story would be very interesting, but it would require a great deal of time. Even so, the result was really interesting, as people got deeply engaged in the game. Players' reactions ranged from nervous laughter to getting angry and quitting the game outright. The interesting and powerful feature, of which we were not fully aware beforehand, is that decision making is a very intimate and sensitive process. When you induce people to choose something against their own beliefs, there is a deep feeling of wrongness, which is why I feel that the In The Box project was a success: it forces a powerful awareness of our real degree of freedom in front of machines. The project was tested at the end of the course, and also shown at the DASH Helsinki festival in June 2016.