Chanting inside the tunnel of your mind: Why Soundself was the most original game at E3
“This build came together the day before E3,” Soundself creator Robin Arnott told me, putting the Oculus Rift over my head. I had played an earlier build of the game, but being inside the experience, complete with headphones that blocked out all other noise, was something else altogether.
If you’ll excuse me, I’m about to sound like a hippy. My apologies.
The tunnel of light is all you see, and it twirls and spins and undulates. I began by humming a low note, and suddenly everything came alive around me. I could hear my own voice, picked up by the microphone and amplified through the headphones until I was surrounded by it, and the tunnel and the shape in the foreground moved and swayed along with my voice. I sang a slightly higher note, and everything changed, reacting to my voice.
Your brain begins to feel odd in this experience: your consciousness doesn’t go away exactly, but you begin to feel broader. As a younger man I experimented with dissociative drugs, and this provided a very similar sensation. Time became elastic, and I didn’t feel like I was humming or chanting so much as I was talking to the game. A few minutes later Arnott removed the virtual reality headset, and it took me a few moments to come back to reality.
“We’re an experimental game, I don’t think anything has been done like this before. There’s nothing we can take for granted,” he told me when I asked about head tracking. Right now the visuals simply fill your field of view, you can’t “look around” inside the experience as if you were there. “The head tracking question brings up a lot of philosophical questions about the game in general. What you’re exploring is not a literal 3D space, it’s kind of a mental, abstract, surreal expanse of possibilities? I don’t know.”
It’s hard to talk about Soundself, even with other people who have played it extensively. In a way it feels like guided meditation. Arnott sent me the E3 build of the game after the show, although that took much begging, and I’ve been spending ten-minute sessions inside, humming to myself, giving in to the loss of self and sense of space.
That abstraction is part of the point; Arnott is adamant that he doesn’t want the game to give you the sense that you’re in a specific place. Looking around is a way to define your surroundings, and it could take you out of the experience.
“What we definitely do want to do is if you tilt your head up, the natural place for you to go next is down,” he explained, mimicking the motion. “If you tilt your head down, the natural place to go next is up. If we can get people to do these natural rhythmic motions…”
Suddenly we’re both bobbing our heads at each other. By adding feedback to an up and down movement in the game, people can fall into a rhythmic motion, which is another way to treat the game as a sort of meditative exercise.
“It’s the difference between being in control of something, and dancing with it,” Arnott said finally. In a way Soundself is just a visualization program: it takes the feedback you give it and turns it into patterns, shapes, and movement. But the way this is achieved makes it seem like the game is responding to you in a very gentle way. You’re not telling it what to do so much as you’re talking with it.
Last night I sat at my desk, a microphone close to my mouth, wireless headphones over my ears, the Rift covering my eyes. I sang a note, and the game began to speed up. I went higher, and the spinning tunnel of light began to move in a more insistent way, and it began to sing an even higher note back at me. We talked this way for five minutes or so, and for that time I was blissfully lost.