I have been working with Manchester Science Partnerships to develop a range of workshops for their customers, the resident companies that use the park. The first session was the 'mirror gaze experiment'. During the mirror gaze experiment [MGE], participants are asked to stare at their own reflection in a mirror in a nearly dark room. An outline of the head is visible as a faint silhouette. In this state of partial sensory deprivation, the brain struggles to make sense of the little visual information it receives. Forms and shapes begin to emerge as if from nowhere. For many observers, these develop into vivid visual hallucinations: “monsters, archetypical faces, faces of relatives, and animals” (Caputo, 2012; Bortolomasi et al., 2014). This
‘Augmented Hand Series’ (Golan Levin, Chris Sugrue, and Kyle McDonald, 2013–2015) and ‘MIRAGE Illusion Box’ (Roger Newport, 2008) are two very similar projects, both of which transform the image of a hand in real time using Augmented Virtuality (AV, the digital manipulation of real-world objects). Both take the form of a black box into which the hand is placed. Once inside, the participant can see their hand as if looking through a window into the box. Inside, a system of mirrors and motion tracking is used: an augmented or distorted digital image of the hand is relayed to a screen on the top of the box. Despite the technical similarities, the two works stem from entirely different motivations. The ‘MIRAGE Illusion Box’ (Newport, 2018)
This international, interdisciplinary group adopts mechanisms employed in the cognitive sciences, such as the work of Mel Slater, and in the arts. Their project ‘The Machine to Be Another’ allows anyone to experience a perspective from the body of another. The group speak of ‘expanding subjective experience’ and ‘understanding the relationship between identity and empathy from an embodied perspective’ (http://beanotherlab.org/). Using HMDs and live video, they have developed a number of critical applications investigating a wide range of issues, including gender and disability. The group study the impact this work can have on people’s lives, employing methods of action research and co-creation. Be Another Lab embraces an open-source approach, sharing and developing their project through workshops, making the tools and
This experiment induces the sensation of a phantom presence in the room. The blindfolded participant is asked to use a stylus to prod an empty space in front of them. Using a tactile feedback system and a robot arm, the participant feels as if they are prodding themselves in the back. As the experiment progresses, the system adds a delay to this prodding. At this stage, the participants become freaked out, believing that someone else is prodding them... https://youtu.be/GnusbO8QjbE Link to the paper... Neurological and Robot-Controlled Induction of an Apparition https://www.cell.com/current-biology/abstract/S0960-9822(14)01212-3
I teamed up with artist Annie Carpenter to pull together a small group of artists and friends for a night of ‘fieldwork’. We organised an overnighter to do some experiments and have discussions together in the relaxed atmosphere of Middlewood Trust study centre, an off-grid permaculture farm. I had worked here before with Annie and Sam Illingworth, running workshops with their students on a previous 'field research' style project. The concept captured my interest. We wanted to create a situation where we could work as well as have time and space to chat about ideas with others. Rather than drive my car, I decided to take a few days out, cycle, and make a road trip of it.
After constructing the Fish-brain-machine PCB circuits, we spent some time experimenting with and describing the hallucinogenic visuals created by the stroboscopic light. The ping pong balls over the eyes diffuse the LED light, making for a more intense effect and enabling use with the eyes open. Here they describe some of the effects, including seeing colours and 'a strange experience' of seeing with only one eye - I get this exact same feeling when using it. It is also hard to know if your eyes are open or closed. [See also Re-mapping the senses workshop] https://www.youtube.com/watch?v=Lwne5xICLkI
While trying to create a simple motion tracking patch using Pd-extended and Gem, I came across this project by Elektro Moon Vision: http://elektromoon.co.nr/. The mini app provides OSC data from movements of the eyebrows, nose, and mouth, plus orientation, scale, etc. This is massively useful for an experiment I have in mind related to the "strange face in the mirror illusion". The data can be captured and used, for example, to control a 3D model in virtual space, matching the rotation, scale, and orientation of the model to the movement of my head... My own patch is a simple motion detection patch that tracks the difference between two frames, creating ghostly outlines of momentarily disembodied features. It's sensitive enough to pick up facial expressions such as the movement of muscles and
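The frame-differencing idea behind the patch can be sketched outside Pd/Gem. Below is a minimal NumPy version of the same technique; the frame size, threshold value, and function name are illustrative, not taken from the actual patch:

```python
import numpy as np

def frame_difference(prev, curr, threshold=30):
    """Return a binary mask of pixels that changed between two frames.

    Pixels whose absolute grey-level difference exceeds `threshold`
    are marked 1 - the 'ghostly outlines' of moving features.
    """
    # Cast to a signed type so the subtraction cannot wrap around
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Two tiny synthetic 'frames': one bright pixel moves one step right
prev = np.zeros((4, 4), dtype=np.uint8)
curr = np.zeros((4, 4), dtype=np.uint8)
prev[1, 1] = 255
curr[1, 2] = 255
mask = frame_difference(prev, curr)
# mask flags both where the feature left and where it arrived
```

In Gem the same effect comes from comparing the current video frame against the previous one, so only motion survives; static parts of the face cancel out, which is why the output reads as disembodied outlines.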