{"id":284006,"date":"2016-03-07T14:56:36","date_gmt":"2016-03-07T22:56:36","guid":{"rendered":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/?post_type=msr-research-item&#038;p=284006"},"modified":"2016-08-29T08:42:15","modified_gmt":"2016-08-29T15:42:15","slug":"enabling-a-future-of-perceptive-assistance-system-support-for-efficiency-and-privacy-in-continuous-mobile-vision","status":"publish","type":"msr-video","link":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/video\/enabling-a-future-of-perceptive-assistance-system-support-for-efficiency-and-privacy-in-continuous-mobile-vision\/","title":{"rendered":"Enabling a Future of Perceptive Assistance: System Support for Efficiency and Privacy in Continuous Mobile Vision"},"content":{"rendered":"<p>In my vision for the future, wearable computing devices will interpret rich visual real-world environments in real time, providing services that assist in our daily lives with &#8220;Continuous Mobile Vision&#8221;. For example, by remembering the faces and objects in our personal encounters, a device can maintain a visual search engine tailored to its user. I show that today&#8217;s system software and imaging hardware are ill-suited for such continuous mobile vision. Highly optimized for photography, current systems fail to attain a sufficient level of energy efficiency and privacy preservation. I present my rethinking of the vision system stack across application frameworks, operating system drivers, and sensor architecture to attain energy efficiency. This cross-layer work contributes: (1) scalability to multiple concurrent vision applications, (2) mechanisms for energy-proportional image capture, and (3) early processing to reduce the image sensor&#8217;s readout burden. Altogether, this yields a two-orders-of-magnitude improvement in the efficiency of vision processing. 
Looking ahead, in addition to seeking further OS and architectural opportunities for efficiency, I seek to enable continuous mobile vision to operate more securely and privately, innovating low-level mechanisms that enable high-level policies for the vision stack. By providing a narrow, monitored view of vision data access, such privacy mechanisms will enable users to trust the vision processing software stack. In the long term, I will push for a future in which continuous mobile vision enables a new wave of personal computing assistance, wherein perception of the real world helps our devices relieve our precious human memory and attention.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In my vision for the future, wearable computing devices will interpret rich visual real-world environments in real time, providing services that assist in our daily lives with &#8220;Continuous Mobile Vision&#8221;. For example, by remembering the faces and objects in our personal encounters, a device can maintain a visual search engine tailored to its user. 
I show that [&hellip;]<\/p>\n","protected":false},"featured_media":275604,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_hide_image_in_river":0,"footnotes":""},"research-area":[13562,13554],"msr-video-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-session-type":[],"msr-impact-theme":[],"msr-pillar":[],"msr-episode":[],"msr-research-theme":[],"class_list":["post-284006","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-research-area-computer-vision","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/0TE0VN8Tne4","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/284006","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":0,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/284006\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media\/275604"}],"wp:attachment":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media?parent=284006"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=284006"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=284006"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=284006
"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=284006"},{"taxonomy":"msr-session-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-session-type?post=284006"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=284006"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=284006"},{"taxonomy":"msr-episode","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-episode?post=284006"},{"taxonomy":"msr-research-theme","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-research-theme?post=284006"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}