‘Augmented hand series’ (Golan Levin, Chris Sugrue, and Kyle McDonald, 2013-2015) and ‘MIRAGE Illusion Box’ (Roger Newport, 2008) are two very similar projects which both transform the image of a hand in real time using Augmented Virtuality (AV, the digital manipulation of real-world objects). Both take the form of a black box into which the hand is placed. Once the hand is inside, the participant can see it as if looking through a window into the box. Inside the box, a system of mirrors and motion tracking relays an augmented or distorted digital image of the hand to a screen on the top of the box. Despite the technical similarities, the two works stem from entirely different motivations. The ‘MIRAGE Illusion Box’ (Newport, 2008)
Tag: interface
Meeting With Sally Linkenauger
Notes from a meeting with Sally Linkenauger, Lancaster University Psychology Dept. After discovering Sally's research at the BRnet conference I wanted to experience her experiments, so I asked if I could visit her lab. The VR lab looks like a normal office space, with PCs scattered around the outside and loads of VR tech hanging around. On closer inspection, a network of cameras is installed around the space to enable complex motion capture and augmented reality experiments to be undertaken. There is a treadmill in the corner. In the experiment I tried, a Leap Motion sensor captures the movements of your real hand, allowing you to articulate a virtual hand. The experiment requires some time to get used to the VR environment. At first,
Seeing with the tongue – Paul Bach-y-Rita –
"You don't see with the eyes. You see with the brain" Paul Bach-y-Rita, Science News Online 1 Sept, 2001; vol. 169, no. 9. Paul Bach-y-Rita was one of the first neuroscientists to study the idea of neuroplasticity. He ran a number of experiments on sensory substitution and developed the "BrainPort" in 1998. This interface uses a camera to feed an image to an electrode array placed on the tongue; blind patients were able to see using the tongue. He also built a chair, the back of which is packed with solenoids, to which a camera feeds an image. After time and with training the user is able to sense images: at first basic shapes, and with more time more detail such
Tactile Anchoring Device Prototype 1
Here is my prototype device intended to help autonomously generate the 'invisible hand illusion'. For this experiment, I created a series of brushes which rotate at different speeds, stroking empty space. The idea is that the participant watches this device while the brushing motion is replicated on their real hand, hidden nearby. This is building towards a piece of work called 'On the embodiment of a discrete Volume of Empty Space' [ See http://antonyhall.net/blogtactile-anchoring-device/ ] https://www.youtube.com/watch?v=DI-f6KgRD2w
Augmented senses
In this experiment, they created a simple device [the Hearspace app] incorporating a compass and headphones. It "allows users to reliably hear the direction of magnetic North as a stable sound object in external space on a headphone". They found "long-lasting integration into the perception of self-rotation. Short training with amplified or reduced rotation gain in the magnetic signal can expand or compress the perceived extent of vestibular self-rotation, even with the magnetic signal absent in the test". I was struck by this statement: "sensory substitution and augmentation research has aimed to restore sensory functionality from non-invasive afferent signals of artificial sensors...there has been little concrete evidence that truly perceptual experiences have ever been obtained via this approach". Sensory augmentation: integration of an auditory
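The paper doesn't publish the app's code, but the core mapping is easy to picture: take the compass heading and render magnetic North as a sound panned to the matching direction. A minimal sketch of that idea, assuming a simple constant-power stereo pan (the function name and pan law are my illustration, not Hearspace's actual implementation):

```python
import math

def north_pan(heading_deg):
    """Stereo gains placing a 'North' sound relative to the head.

    heading_deg: direction the listener faces, clockwise degrees
    from magnetic North (0 = facing North).
    Returns (left_gain, right_gain) using a constant-power pan:
    North ahead -> centred; North to one side -> panned that way.
    """
    # North's bearing relative to the head: facing East (90) puts
    # North 90 degrees to the LEFT, so negate the heading.
    rel = math.radians(-heading_deg)
    # Collapse to a left/right position in [-1, 1].
    x = max(-1.0, min(1.0, math.sin(rel)))
    # Constant-power crossfade: theta 0 = hard left, pi/2 = hard right.
    theta = (x + 1.0) * math.pi / 4.0
    return math.cos(theta), math.sin(theta)
```

Facing North gives equal gains (sound centred ahead); facing East pushes the sound hard left, facing West hard right. A real implementation would presumably use binaural cues (HRTFs) rather than plain panning, but the heading-to-direction mapping is the same.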
Fish-brain-machine / Radiona workshop
After constructing the Fish-brain-machine PCB circuits we spent some time experimenting and describing the hallucinogenic visuals created by the stroboscopic light. The ping pong balls over the eyes diffuse the LED light, making for a more intense effect - and enabling use with eyes open. Here they describe some of the effects including seeing colours and 'a strange experience' of seeing with only one eye - I get this exact same feeling when using it. It is also hard to know if your eyes are open or closed. [See also Re-mapping the senses workshop ] https://www.youtube.com/watch?v=Lwne5xICLkI
Fish-brain-machine
As part of my Enki exhibition at Kapelica Gallery in Ljubljana 2012, I developed a related perceptual illusions and brain hacks workshop with Marc Dusseiller [Hackteria], as part of the gallery's Biotech program. We came up with the idea of making a special-edition circuit for the workshop, and we set to work designing a circuit that encapsulated the Enki project in miniature. After a couple of late nights, we came up with this super cool PCB design. Marc worked hard to create a fully functional, efficient design which was also aesthetically pleasing. The outline of the fish is also the ground in the circuit. This has to be the most minimal brain-machine available to build: six components. We spent further
Face as Interface
Trying to create a simple motion-tracking patch using Pd-extended and GEM, I came across this project by Elektro Moon Vision (http://elektromoon.co.nr/). The mini app provides OSC data from movements such as eyebrows, nose, mouth, orientation, scale etc. This is massively useful for an experiment I have in mind related to the "strange face in the mirror illusion". The data can be captured and used to control a 3D model in virtual space, for example matching the rotation, scale and orientation of the model to the movement of my head... This is a simple motion-detection patch that tracks the difference between two frames, creating ghostly outlines of momentarily disembodied features. It's sensitive enough to pick up facial expressions such as the movement of muscles and
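The Pd/GEM patch itself isn't reproduced here, but the frame-differencing idea behind it is simple enough to sketch in a few lines of Python/NumPy (the function name and threshold value are illustrative, not taken from the patch):

```python
import numpy as np

def motion_mask(prev, curr, threshold=25):
    """Flag pixels that changed between two greyscale frames.

    prev, curr: uint8 arrays of the same shape (two consecutive frames).
    Returns a uint8 mask: 255 where the absolute per-pixel difference
    exceeds `threshold`, 0 elsewhere.
    """
    # Widen to signed ints so the subtraction can't wrap around.
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8) * 255
```

Run over a live camera feed, only the moving regions survive in the mask, which is exactly what produces those ghostly outlines of features: anything still between frames differences away to black.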