{"id":579130,"date":"2019-04-18T09:00:20","date_gmt":"2019-04-18T16:00:20","guid":{"rendered":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/?p=579130"},"modified":"2019-04-29T12:47:40","modified_gmt":"2019-04-29T19:47:40","slug":"advancing-accessibility-on-the-web-in-virtual-reality-and-in-the-classroom","status":"publish","type":"post","link":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/blog\/advancing-accessibility-on-the-web-in-virtual-reality-and-in-the-classroom\/","title":{"rendered":"Advancing accessibility on the web, in virtual reality, and in the classroom"},"content":{"rendered":"<p><a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-579727 size-large\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-1024x576.png\" alt=\"A webpage shown in typical browser view and in the browser reader view.\" width=\"1024\" height=\"576\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-1024x576.png 1024w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-300x169.png 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-768x432.png 768w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-1066x600.png 1066w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-655x368.png 655w, 
https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-343x193.png 343w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788.png 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p>At the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/chi2019.acm.org\/\">ACM CHI Conference on Human Factors in Computing Systems<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> in Glasgow, Scotland, this May, researchers from Microsoft\u2019s Redmond and UK labs, together with our university collaborators, will be presenting several papers and demos that explore how to design technologies more inclusively, to support accessibility by users with cognitive and\/or sensory disabilities.<\/p>\n<p>Microsoft researchers Adam Fourney, Kevin Larson, and I teamed up with University of Washington researchers Qisheng Li and Katharina Reinecke to explore the accessibility of the Web to people with dyslexia. Dyslexia is a cognitive disability estimated to affect about 15% of English-speaking adults; people with dyslexia can experience varying degrees of difficulty with reading-related tasks. 
Because access to information on the Web is a key modern literacy skill, ensuring that online information is cognitively accessible is an important concern; beyond people with dyslexia, improving cognitive access to the Web may benefit other groups who experience reading challenges such as English language learners or children.<\/p>\n<p>At CHI 2019, lead author and University of Washington graduate student Qisheng Li will present the Microsoft Research-UW team\u2019s findings, summarized in their paper, \u201c<a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/01\/Li_Reading-View_CHI19.pdf\">The Impact of Web Browser Reader Views on Reading Speed and User Experience.<\/a>\u201d The team explored whether the \u201creading mode\u201d common in most modern browsers significantly impacted users\u2019 reading speed and comprehension, and whether users with dyslexia specifically benefitted from this intervention. Using the \u201c<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/www.labinthewild.org\/\">Lab in the Wild<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>\u201d infrastructure developed by Professor Reinecke, the team conducted an online study with 391 English-speaking adults (42 with dyslexia), in which participants read several popular webpages and answered associated reading-comprehension questions, some in the typical browser view and some in the reading mode.<\/p>\n<div id=\"attachment_579148\" style=\"width: 787px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/readerview_browser.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-579148\" class=\"wp-image-579148 size-full\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/readerview_browser.png\" alt=\"A webpage in 
typical browser view (left), and in the reading mode (right).\" width=\"777\" height=\"489\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/readerview_browser.png 777w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/readerview_browser-300x189.png 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/readerview_browser-768x483.png 768w\" sizes=\"auto, (max-width: 777px) 100vw, 777px\" \/><\/a><p id=\"caption-attachment-579148\" class=\"wp-caption-text\">A webpage in typical browser view (left), and in the reading mode (right).<\/p><\/div>\n<p>As expected, people with dyslexia had substantially slower reading speeds than people without dyslexia; however, people with dyslexia did not appear to receive any differential benefit from the reading mode. Instead, the team found that reader view improved the reading speed of all users by about 5% compared to the default website view. However, the study also found that reader mode buttons are disabled by default and that the rules governing the availability of reader mode are opaque to web developers: only 41% of the 1,100 popular webpages sampled successfully enabled reader view. Our findings suggest that web page designers should structure their pages so that the reader mode button is enabled in major browsers, giving users the option to reap this reading-speed benefit. Key areas for future work include making it easier for web developers to intentionally enable the reading mode option and exploring which aspects of the reader view transformations provide the most benefit.<\/p>\n<p>Accessibility beyond the traditional desktop computing experience is also a focus of Microsoft Research\u2019s contributions to CHI this year. 
Intern Yuhang Zhao, a graduate student at Cornell Tech, will present a paper summarizing joint research with Microsoft researchers Ed Cutrell, Christian Holz, Eyal Ofek, myself, and Andrew Wilson that explores how to enhance the accessibility of emerging virtual reality (VR) technologies: \u201c<a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/01\/SeeingVRchi2019.pdf\">SeeingVR: A Set of Tools to Make Virtual Reality More Accessible to People with Low Vision.<\/a>\u201d The team will also present a live demo during the conference\u2019s demonstration session and at the Microsoft booth; blog readers can experience a demo by viewing <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/aka.ms\/seeingvrvideo\">the project\u2019s online video figure<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n<p>Low vision (that is, visual disabilities that cannot be fully corrected by glasses) impacts 217 million people worldwide according to the World Health Organization. While desktop software offers some accommodation features for people with low vision (for example, screen magnifiers), VR systems have not yet grappled with the issue of accessibility for this audience. Indeed, when interviewing VR developers, the team found that none had received training or guidance on how to develop accessible VR experiences.<\/p>\n<p>Because low vision encompasses a range of visual abilities (for example, tunnel vision, blind spots, brightness sensitivity, low visual acuity, and so on), the team took a toolkit approach\u2014they developed SeeingVR, a set of 14 tools for Unity developers (Unity is one of the most widely-used VR development platforms). End-users can activate different combinations of these tools depending on their abilities and the context of the current application and task. 
Example tools include a magnifier and bifocal views, brightness and contrast adjustment for the scene, edge-enhancement to make virtual objects more salient from their backgrounds, depth measurement tools, and the ability to point at text or objects in a virtual scene to have them read or described aloud. The majority of these tools can be applied to existing Unity applications post-hoc, to support easy adoption.<\/p>\n<div id=\"attachment_579175\" style=\"width: 1034px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/low_vision.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-579175\" class=\"wp-image-579175 size-large\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/low_vision-1024x283.png\" alt=\"The 14 SeeingVR tools, overlaid individually upon a scene from the open source Unity game EscapeVR-HarryPotter; end-users can combine the individual tools as needed, based on their visual abilities.\" width=\"1024\" height=\"283\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/low_vision-1024x283.png 1024w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/low_vision-300x83.png 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/low_vision-768x212.png 768w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><p id=\"caption-attachment-579175\" class=\"wp-caption-text\">The 14 SeeingVR tools, overlaid individually upon a scene from the open source Unity game EscapeVR-HarryPotter; end-users can combine the individual tools as needed, based on their visual abilities.<\/p><\/div>\n<p>Evaluation with 11 people with low vision completing a variety of tasks in VR (for example, menu selection, grasping objects, shooting moving targets) found that all participants could complete tasks more quickly and accurately when 
using SeeingVR tools as compared to the default VR experience. All participants chose different combinations of the available tools, reinforcing the value of allowing flexibility and customization of low vision accessibility options.<\/p>\n<p>Microsoft researchers are also exploring non-visual representations of VR for people who are completely blind. <a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/product\/soundscape\/\">Microsoft Soundscape<\/a> is a smartphone application that uses spatial audio to deliver a rich, non-visual navigation experience. At the CHI 2019 workshop on \u201cHacking Blind Navigation\u201d (co-organized by Principal Researcher Ed Cutrell), Microsoft Research intern and University of Washington student Anne Spencer Ross will present research on how to craft an audio-only VR experience that allows people to rehearse a walking route virtually before experiencing the route in the physical world via Soundscape. Her paper, \u201c<a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/publication\/use-cases-and-impact-of-audio-based-virtual-exploration\/\">Use Cases and Impact of Audio-Based Virtual Exploration<\/a>,\u201d is a collaboration between engineers from the Soundscape team (Melanie Kneisel and Alex Fiannaca) and researchers in the Microsoft Research Redmond Lab (Ed Cutrell and me). Melanie Kneisel will also be a featured speaker at the workshop.<\/p>\n<p>In addition to presenting research on accessible Web browsing and accessible VR, researchers from Microsoft\u2019s Cambridge, UK lab will be sharing a tangible toolkit to enhance the accessibility of computer science education for children who are blind. 
Led by researcher Cecily Morrison, the <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/news.microsoft.com\/innovation-stories\/project-torino-code-jumper\/\">CodeJumper<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> project is a physical programming language for teaching children ages 7\u201311 basic programming concepts and computational thinking regardless of level of vision. It was inspired by the need to provide a way for young blind and low vision students to access the computing curriculum inclusively alongside their sighted peers. Children plug together pods that represent lines of code to create programs that, when run, make music, stories, or poetry. Children can start with very simple concepts\u2014such as the idea that a program is a sequence of commands\u2014and progress to complicated program flows that utilize variables, covering the whole of the curriculum for this age band. It was successfully tested with 75 children and 30 teachers across the United Kingdom and found to support age-appropriate learning of coding as well as to encourage whole-child learning, such as forming friendships with sighted peers. 
The tangible CodeJumper kit will be available for CHI participants to experience during the conference\u2019s demo session.<\/p>\n<p>We look forward to seeing you at CHI 2019 in Glasgow and sharing ideas and advancing the accessibility conversation together.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>At the ACM CHI Conference on Human Factors in Computing Systems conference in Glasgow, Scotland this May, researchers from Microsoft\u2019s Redmond and UK labs, together with our university collaborators, will be presenting several papers and demos that explore how to design technologies more inclusively, to support accessibility by users with cognitive and\/or sensory disabilities. Microsoft [&hellip;]<\/p>\n","protected":false},"author":38022,"featured_media":579727,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":[{"type":"user_nicename","value":"Meredith Ringel Morris","user_id":"32884"}],"msr_hide_image_in_river":0,"footnotes":""},"categories":[194481],"tags":[],"research-area":[13554],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-579130","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-human-centered-computing","msr-research-area-human-computer-interaction","msr-locale-en_us"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[283244],"related-projects":[573690],"related-events":[577950],"related-researchers":[],"msr_type":"Post","featured_image_thumbnail":"<img 
width=\"960\" height=\"540\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788.png\" class=\"img-object-cover\" alt=\"\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788.png 1400w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-300x169.png 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-768x432.png 768w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-1024x576.png 1024w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-1066x600.png 1066w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-655x368.png 655w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/04\/Accessibility-Reader-View_Site_04_2019_1400x788-343x193.png 343w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"Meredith Ringel Morris","formattedDate":"April 18, 2019","formattedExcerpt":"At the ACM CHI Conference on Human Factors in Computing Systems conference in Glasgow, Scotland this May, researchers from Microsoft\u2019s Redmond and UK labs, together with our university collaborators, will be presenting several papers and demos that explore how to design technologies more inclusively, 
to&hellip;","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts\/579130","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/users\/38022"}],"replies":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/comments?post=579130"}],"version-history":[{"count":16,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts\/579130\/revisions"}],"predecessor-version":[{"id":579889,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts\/579130\/revisions\/579889"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media\/579727"}],"wp:attachment":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media?parent=579130"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/categories?post=579130"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/tags?post=579130"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=579130"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=579130"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=579130"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/cm-edgetun.pa
ges.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=579130"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=579130"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=579130"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=579130"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=579130"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}