Extracts from an experiment/performance with the Autoscope, to see whether the feeling of disembodied or remote presence is amplified by taking on a challenging physical task, and to explore how this affects our perception of the landscape. Thanks to Andrew P Brooks for film and photography. "Autoscope builds on laboratory-based simulations of out-of-body experiences; the portable device allows the participant to freely navigate the world, experiencing themselves in the third person, as part of the landscape, via a live video feed to a head-mounted display. The visual mechanisms are important in this illusion, but tactile and sonic stimuli further strengthen the effects." https://www.youtube.com/watch?v=yY7bs2Lih0Y&t=386s
Notes from a day testing the Autoscope with Andrew Brooks. Here is our nod to the painting "Wanderer above the Sea of Fog" by Caspar David Friedrich (1818). The painting juxtaposes 'Man' and his mastery of nature with his being, at the same time, a small and insignificant element of the overall landscape. The other mechanism at work is the inverted gaze: we see the back of the painted character's head, something that was quite unusual at the time. One idea is that, in doing this, we imagine ourselves as part of the image, putting ourselves in the place of the depicted figure. It seems apt to recreate this, as one effect of the Autoscope is precisely the sensation that you have become part of the landscape.
Roger Newport gave a talk on his research at the Body Up conference last week. [See work with the MIRAGE box here: Augmented Hands] When I spoke to him afterwards, he asked if I had ever experienced the six-finger illusion. He reached into his bag and revealed a mirror, which he placed on the table. Placing my hand on one side of the mirror, he skilfully performed the illusion. Looking into the mirror I watched as each finger was stroked predictably in series, moving from thumb to little finger, but then continuing to apparently stroke an invisible sixth finger. This worked perfectly for me. What is really interesting is that apparently only a very small number of people have reported seeing this.
Autoscope builds on laboratory-based simulations of out-of-body experiences; the portable device allows the participant to freely navigate the world, experiencing themselves in the third person, as part of the landscape, via a live video feed to a head-mounted display. The visual mechanisms are important in this illusion, but tactile and sonic stimuli further strengthen the effects. Autoscopy can be described as the disembodied perception of seeing one's own body from an elevated or distanced location outside the body. The phenomenon of the 'out-of-body experience' during heightened states of consciousness or near-death experience tends to have spiritual or shamanistic connotations, but in recent times science has done much to demystify it, identifying the neural mechanisms responsible.
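The core of the device is simply a relay: frames from the boom-mounted camera are passed, live, to the head-mounted display. A minimal sketch of that loop, with a configurable frame delay (useful because feed latency is known to affect these illusions), is below. This is an illustration only: the `FrameRelay` class and its names are my own, the camera and display are stand-ins, and nothing here reflects the Autoscope's actual software.

```python
from collections import deque

class FrameRelay:
    """Relays 'camera' frames to a 'display', optionally lagging the feed
    by a fixed number of frames (hypothetical sketch, not the real device)."""

    def __init__(self, delay_frames=0):
        self.buffer = deque()       # frames waiting to be shown
        self.delay = delay_frames   # how many frames the display lags behind

    def push(self, frame):
        """Accept one camera frame; return the frame the display should
        show now, or None while the delay buffer is still filling."""
        self.buffer.append(frame)
        if len(self.buffer) > self.delay:
            return self.buffer.popleft()
        return None

# With a 2-frame delay, the display lags the camera by two frames:
relay = FrameRelay(delay_frames=2)
shown = [relay.push(f) for f in ["f0", "f1", "f2", "f3"]]
# shown == [None, None, "f0", "f1"]
```

With `delay_frames=0` every frame is shown immediately, which is the ordinary live-feed case described above.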
My alloy backpack arrived and has proved perfect for the job of supporting a laptop and a 2m-long boom. As an object, it certainly has a vintage feel and a clear aesthetic reference to Roman Signer. This seems entirely appropriate: something about the strangeness of the endeavour, as well as the metaphorical notions of escape and travelling through the imagination.
"He felt dizzy, stood up, turned around, and saw himself still lying in bed. He was aware that the person in bed was him, and was not willing to get up and would thus make himself late for work. Furious at the prone self, the man shouted at it, shook it, and even jumped on it, all to no avail..." This article encapsulates some of the most important ideas, concepts and developments which lead towards the experiments on the simulation of out of body experiences. The article mentions Peter Brugger, Olaf Blanke and Thomas Metzinger, as well as some details of patient experiences, and the story of how this experiment came about. As well as the illusion of seeing one's self"autoscopy' There
‘Augmented Hand Series’ (Golan Levin, Chris Sugrue, and Kyle McDonald, 2013–2015) and ‘MIRAGE Illusion Box’ (Roger Newport, 2008) are two very similar projects, both of which transform the image of a hand in real time using Augmented Virtuality (AV, the digital manipulation of real-world objects). Both take the form of a black box into which the hand is placed. The participant sees their hand as if looking through a window into the box; inside, a system of mirrors and motion tracking is used, and an augmented or distorted digital image of the hand is relayed to a screen on the top of the box. Despite the technical similarities, the two works stem from entirely different motivations.
This international, interdisciplinary group adopts mechanisms employed in the cognitive sciences, such as the work of Mel Slater, and in the arts. Their project ‘The Machine to Be Another’ allows anyone to experience a perspective from the body of another. The group speak of ‘expanding subjective experience’ and ‘understanding the relationship between identity and empathy from an embodied perspective’ (http://beanotherlab.org/). Using HMDs and live video, they have developed a number of critical applications, investigating a wide range of issues including gender and disability. The group study the impact this work can have on people’s lives, employing methods of action research and co-creation. Be Another Lab embraces an open-source approach, sharing and developing their project through workshops.
Notes from a meeting with Sally Linkenauger, Lancaster University Psychology Dept. After discovering Sally's research at the BRnet conference I wanted to experience her experiments, so I asked if I could visit her lab. The VR lab looks like a normal office space: PCs scattered around the outside, and loads of VR tech hanging around. On closer inspection, a network of cameras is installed around the space to enable complex motion-capture and augmented-reality experiments. There is a treadmill in the corner. In the experiment I tried, a Leap Motion sensor captures the movements of your real hand, allowing you to articulate a virtual hand. The experiment requires some time to become used to the VR environment.
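The mapping from tracked hand to virtual hand can be sketched as a simple transform: take each tracked joint position and scale it about the wrist, so the rendered hand can be made larger or smaller than the real one. This is a hypothetical illustration, not Sally's actual software; the function, joint labels and scale factor are all my own assumptions.

```python
def to_virtual_hand(joints, wrist, scale=1.0):
    """Scale tracked joint positions about the wrist, so the virtual hand
    appears larger (scale > 1) or smaller (scale < 1) than the real hand.
    joints: {name: (x, y, z)}; wrist: (x, y, z). Illustrative sketch only."""
    return {
        name: tuple(w + scale * (p - w) for p, w in zip(pos, wrist))
        for name, pos in joints.items()
    }

# With the wrist at the origin and scale 1.5, every joint moves 50% further out:
tracked = {"index_tip": (2.0, 4.0, 0.0), "thumb_tip": (1.0, 2.0, 0.0)}
virtual = to_virtual_hand(tracked, wrist=(0.0, 0.0, 0.0), scale=1.5)
# virtual["index_tip"] == (3.0, 6.0, 0.0)
```

A scale of 1.0 reproduces the tracked hand exactly, which is the baseline condition in this kind of experiment.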