{"id":937698,"date":"2023-05-01T10:49:32","date_gmt":"2023-05-01T17:49:32","guid":{"rendered":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/"},"modified":"2023-05-01T15:36:12","modified_gmt":"2023-05-01T22:36:12","slug":"adhocprox-sensing-mobile-ad-hoc-collaborative-device-formations-using-dual-ultra-wideband-radios","status":"publish","type":"msr-research-item","link":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/publication\/adhocprox-sensing-mobile-ad-hoc-collaborative-device-formations-using-dual-ultra-wideband-radios\/","title":{"rendered":"AdHocProx: Sensing Mobile, Ad-Hoc Collaborative Device Formations using Dual Ultra-Wideband Radios"},"content":{"rendered":"<p>We present AdHocProx, a system that uses device-relative, inside-out sensing to augment co-located collaboration across multiple devices, without recourse to externally-anchored beacons \u2013 or even reliance on WiFi connectivity.<\/p>\n<p>AdHocProx achieves this via sensors including dual ultra-wideband (UWB) radios for sensing distance and angle to other devices in dynamic, ad-hoc arrangements; plus capacitive grip to determine where the user\u2019s hands hold the device, and to partially correct for the resulting UWB signal attenuation. All spatial sensing and communication takes place via the side-channel capability of the UWB radios, suitable for small-group collaboration across up to four devices (eight UWB radios).<\/p>\n<p>Together, these sensors detect proximity and natural, socially meaningful device movements to enable contextual interaction techniques. 
We find that AdHocProx achieves 95% accuracy in recognizing various ad-hoc device arrangements in an offline evaluation, with participants particularly appreciative of interaction techniques that automatically leverage proximity-awareness and relative orientation amongst multiple devices.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-937725\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1-1024x448.jpg\" alt=\"Figure 1: Wide strip panel titled 'AdHocProx: Sensing mobile ad-hoc device formations'. Left-most features two illustrated people at a table using devices together, labeled 'Inside-out sensing to detect presence and relative orientation of nearby devices'. Next features two standing people using devices together, labeled 'Distance- and orientation-aware co-located collaborative multi-device experiences'. Next depicts a rectangle with components labeled: capacitive grip sensors, dual DW3000 ultra-wideband (UWB) radios, and device-integrated inertial measurement unit (IMU). 
Finally, a series of illustrated devices in motion labeled 'Cross-device interaction scenarios for dynamic, ad-hoc device formations', with subtitles 'Moving or copying content through portals' and 'Notes for ad-hoc annotations on second device'.\" width=\"768\" height=\"336\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1-1024x448.jpg 1024w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1-300x131.jpg 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1-768x336.jpg 768w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1-1536x671.jpg 1536w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1-240x105.jpg 240w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-Fig-1.jpg 1734w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/p>\n","protected":false},"excerpt":{"rendered":"<p>We present AdHocProx, a system that uses device-relative, inside-out sensing to augment co-located collaboration across multiple devices, without recourse to externally-anchored beacons \u2013 or even reliance on WiFi connectivity. 
AdHocProx achieves this via sensors including dual ultra-wideband (UWB) radios for sensing distance and angle to other devices in dynamic, ad-hoc arrangements; plus capacitive grip to [&hellip;]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":null,"msr_publishername":"ACM","msr_publisher_other":"","msr_booktitle":"","msr_chapter":"","msr_edition":"","msr_editors":"","msr_how_published":"","msr_isbn":"","msr_issue":"","msr_journal":"","msr_number":"","msr_organization":"","msr_pages_string":"","msr_page_range_start":"Article No. 623: 1","msr_page_range_end":"18","msr_series":"","msr_volume":"","msr_copyright":"","msr_conference_name":"CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems","msr_doi":"","msr_arxiv_id":"","msr_s2_paper_id":"","msr_mag_id":"","msr_pubmed_id":"","msr_other_authors":"","msr_other_contributors":"","msr_speaker":"","msr_award":"","msr_affiliation":"","msr_institution":"","msr_host":"","msr_version":"","msr_duration":"","msr_original_fields_of_study":"","msr_release_tracker_id":"","msr_s2_match_type":"","msr_citation_count_updated":"","msr_published_date":"2023-4-23","msr_highlight_text":"","msr_notes":"","msr_longbiography":"","msr_publicationurl":"","msr_external_url":"","msr_secondary_video_url":"","msr_conference_url":"","msr_journal_url":"","msr_s2_pdf_url":"","msr_year":0,"msr_citation_count":0,"msr_influential_citations":0,"msr_reference_count":0,"msr_s2_match_confidence":0,"msr_microsoftintellectualproperty":true,"msr_s2_open_access":false,"msr_s2_author_ids":[],"msr_pub_ids":[],"msr_hide_image_in_river":0,"footnotes":""},"msr-research-highlight":[],"research-area":[13554],"msr-publication-type":[193716],"msr-publisher":[],"msr-focus-area":[],"msr-locale":[268875],"msr-post-option":[],"msr-field-of-study":[248485],"m
sr-conference":[],"msr-journal":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-937698","msr-research-item","type-msr-research-item","status-publish","hentry","msr-research-area-human-computer-interaction","msr-locale-en_us","msr-field-of-study-human-computer-interaction"],"msr_publishername":"ACM","msr_edition":"","msr_affiliation":"","msr_published_date":"2023-4-23","msr_host":"","msr_duration":"","msr_version":"","msr_speaker":"","msr_other_contributors":"","msr_booktitle":"","msr_pages_string":"","msr_chapter":"","msr_isbn":"","msr_journal":"","msr_volume":"","msr_number":"","msr_editors":"","msr_series":"","msr_issue":"","msr_organization":"","msr_how_published":"","msr_notes":"","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":1,"msr_main_download":"","msr_publicationurl":"","msr_doi":"","msr_publication_uploader":[{"type":"file","viewUrl":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-CHI-2023.pdf","id":"937707","title":"adhocprox-chi-2023","label_id":"243109","label":0},{"type":"doi","viewUrl":"false","id":"false","title":"https:\/\/doi.org\/10.1145\/3544548.3581300","label_id":"243106","label":0}],"msr_related_uploader":"","msr_citation_count":0,"msr_citation_count_updated":"","msr_s2_paper_id":"","msr_influential_citations":0,"msr_reference_count":0,"msr_arxiv_id":"","msr_s2_author_ids":[],"msr_s2_open_access":false,"msr_s2_pdf_url":null,"msr_attachments":[{"id":937707,"url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2023\/05\/AdHocProx-CHI-2023.pdf"}],"msr-author-ordering":[{"type":"text","value":"Richard Li","user_id":0,"rest_url":false},{"type":"user_nicename","value":"Teddy 
Seyed","user_id":38721,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Teddy Seyed"},{"type":"user_nicename","value":"Nicolai Marquardt","user_id":42630,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Nicolai Marquardt"},{"type":"text","value":"Eyal Ofek","user_id":0,"rest_url":false},{"type":"user_nicename","value":"Steve Hodges","user_id":33628,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Steve Hodges"},{"type":"text","value":"Mike Sinclair","user_id":0,"rest_url":false},{"type":"guest","value":"hugo-romat","user_id":696222,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=hugo-romat"},{"type":"user_nicename","value":"Michel Pahud","user_id":33007,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Michel Pahud"},{"type":"text","value":"Jatin Sharma","user_id":0,"rest_url":false},{"type":"text","value":"William A. S. 
Buxton","user_id":0,"rest_url":false},{"type":"user_nicename","value":"Ken Hinckley","user_id":32521,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Ken Hinckley"},{"type":"user_nicename","value":"Nathalie Henry Riche","user_id":33058,"rest_url":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Nathalie Henry Riche"}],"msr_impact_theme":[],"msr_research_lab":[199565],"msr_event":[],"msr_group":[379814,1105932],"msr_project":[937905,698833],"publication":[],"video":[],"msr-tool":[],"msr_publication_type":"inproceedings","related_content":{"projects":[{"ID":937905,"post_title":"Transcendence","post_name":"transcendence","post_type":"msr-project","post_date":"2023-05-01 15:33:11","post_modified":"2024-04-05 08:00:06","post_status":"publish","permalink":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/project\/transcendence\/","post_excerpt":"Reinventing how we work together The Transcendence Project at Microsoft Research is reimagining interaction, productivity, and collaboration, harnessing the power of AI to transcend space, time, and modality, and redefine how we work together in the future.","_links":{"self":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/937905"}]}},{"ID":698833,"post_title":"SurfaceFleet","post_name":"surfacefleet","post_type":"msr-project","post_date":"2020-10-16 15:09:07","post_modified":"2023-04-21 22:31:42","post_status":"publish","permalink":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/project\/surfacefleet\/","post_excerpt":"SurfaceFleet is a system and toolkit that uses resilient and performant distributed programming techniques to explore cross-device user experiences. With appropriate design, these technologies afford mobility of user activity unbounded by device, application, user, and time. 
The vision of the project is to enable a future where an ecosystem of technologies seamlessly transition user activity from one place to another, whether that \u201cplace\u201d takes the form of a literal location, a different device form-factor, the&hellip;","_links":{"self":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/698833"}]}}]},"_links":{"self":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/937698","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":8,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/937698\/revisions"}],"predecessor-version":[{"id":937818,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/937698\/revisions\/937818"}],"wp:attachment":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media?parent=937698"}],"wp:term":[{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=937698"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=937698"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=937698"},{"taxonomy":"msr-publisher","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-publisher?post=937698"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=937698"},{"taxonomy":"msr-locale","embeddable":
true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=937698"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=937698"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=937698"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=937698"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=937698"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=937698"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=937698"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}