{"id":298,"date":"2018-01-17T16:45:31","date_gmt":"2018-01-17T16:45:31","guid":{"rendered":"https:\/\/antonyhallphd.wordpress.com\/?p=298"},"modified":"2019-08-14T19:49:42","modified_gmt":"2019-08-14T19:49:42","slug":"face-as-interface","status":"publish","type":"post","link":"https:\/\/antonyhall.net\/blog\/face-as-interface\/","title":{"rendered":"Face as Interface"},"content":{"rendered":"<p>While trying to create a simple motion-tracking patch using Pd-extended and GEM, I came across this project by Elektro Moon Vision: <a href=\"http:\/\/elektromoon.co.nr\/\">http:\/\/elektromoon.co.nr\/<\/a>. The mini app provides OSC data from facial movements (eyebrows, nose, mouth) along with head orientation and scale. This is massively useful for an experiment I have in mind related to the <a href=\"http:\/\/antonyhall.net\/blog\/my-experience-of-the-strange-face-illusion\/\">&#8220;strange face in the mirror illusion&#8221;<\/a>. The data can be captured and used to control a 3D model in virtual space, for example. 
Matching the model&#8217;s rotation, scale and orientation to the movement of my head&#8230;<\/p>\n<div id='gallery-1' class='gallery galleryid-298 gallery-columns-3 gallery-size-thumbnail'><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon portrait'>\n\t\t\t\t<a href='https:\/\/antonyhall.net\/blog\/img_1465\/'><img width=\"150\" height=\"150\" src=\"https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/img_1465-150x150.jpg\" class=\"attachment-thumbnail size-thumbnail\" alt=\"\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/img_1465-150x150.jpg 150w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/img_1465-300x300.jpg 300w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/img_1465-100x100.jpg 100w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><\/a>\n\t\t\t<\/div><\/figure><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon landscape'>\n\t\t\t\t<a href='https:\/\/antonyhall.net\/blog\/screen-shot-2018-01-17-at-16-36-48\/'><img width=\"150\" height=\"150\" src=\"https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-17-at-16-36-48-150x150.png\" class=\"attachment-thumbnail size-thumbnail\" alt=\"\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-17-at-16-36-48-150x150.png 150w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-17-at-16-36-48-300x300.png 300w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-17-at-16-36-48-100x100.png 100w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><\/a>\n\t\t\t<\/div><\/figure>\n\t\t<\/div>\n\n<p>This is a simple motion-detection patch that tracks the difference between two consecutive frames, creating ghostly outlines of momentarily disembodied features. 
It&#8217;s sensitive enough to pick up facial expressions such as the movement of muscles and eyeballs. A combination of these two systems should be enough to project onto a face and follow its movements.<\/p>\n<figure id=\"attachment_media-19\" aria-describedby=\"caption-attachment-media-19\" style=\"width: 2560px\" class=\"wp-caption alignnone\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-308\" src=\"https:\/\/antonyhallphd.files.wordpress.com\/2018\/01\/screen-shot-2018-01-19-at-12-50-44.png\" alt=\"Screen Shot 2018-01-19 at 12.50.44.png\" width=\"2560\" height=\"1600\" srcset=\"https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-50-44.png 2560w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-50-44-600x375.png 600w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-50-44-300x188.png 300w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-50-44-768x480.png 768w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-50-44-1024x640.png 1024w\" sizes=\"(max-width: 2560px) 100vw, 2560px\" \/><figcaption id=\"caption-attachment-media-19\" class=\"wp-caption-text\">In this video still, the tracking detects movement on the left side of my face as I smile on one side, testing whether it picks up the movement of my mouth as well as my cheek muscles. 
The red circle marks the centre of mass of the detected motion.<\/figcaption><\/figure>\n<figure id=\"attachment_304\" aria-describedby=\"caption-attachment-304\" style=\"width: 2560px\" class=\"wp-caption alignnone\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-304\" src=\"https:\/\/antonyhallphd.files.wordpress.com\/2018\/01\/screen-shot-2018-01-19-at-12-49-53.png\" alt=\"Screen Shot 2018-01-19 at 12.49.53\" width=\"2560\" height=\"1600\" srcset=\"https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-49-53.png 2560w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-49-53-600x375.png 600w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-49-53-300x188.png 300w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-49-53-768x480.png 768w, https:\/\/antonyhall.net\/blog\/wp-content\/uploads\/2018\/01\/screen-shot-2018-01-19-at-12-49-53-1024x640.png 1024w\" sizes=\"(max-width: 2560px) 100vw, 2560px\" \/><figcaption id=\"caption-attachment-304\" class=\"wp-caption-text\">Sensing motion by measuring differences between live video frames, using Pure Data and GEM<\/figcaption><\/figure>\n<p><a href=\"http:\/\/antonyhall.net\/blog\/my-experience-of-the-strange-face-illusion\/\">See here for the experiments with the strange face in the mirror illusion so far&#8230;<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>While trying to create a simple motion-tracking patch using Pd-extended and GEM, I came across this project by Elektro Moon Vision: http:\/\/elektromoon.co.nr\/. The mini app provides OSC data from facial movements (eyebrows, nose, mouth) along with head orientation and scale. 
This is massively&nbsp;useful for an experiment I have in mind related to the &#8220;strange face in the&#8230;<\/p>\n","protected":false},"author":1,"featured_media":304,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[240],"tags":[6,50,60,62,177,85],"_links":{"self":[{"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/posts\/298"}],"collection":[{"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/comments?post=298"}],"version-history":[{"count":2,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/posts\/298\/revisions"}],"predecessor-version":[{"id":1339,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/posts\/298\/revisions\/1339"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/media\/304"}],"wp:attachment":[{"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/media?parent=298"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/categories?post=298"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/antonyhall.net\/blog\/wp-json\/wp\/v2\/tags?post=298"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}