{"id":585421,"date":"2019-05-10T09:01:28","date_gmt":"2019-05-10T16:01:28","guid":{"rendered":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/?p=585421"},"modified":"2023-03-21T16:42:01","modified_gmt":"2023-03-21T23:42:01","slug":"creating-ai-glass-boxes-open-sourcing-a-library-to-enable-intelligibility-in-machine-learning","status":"publish","type":"post","link":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/blog\/creating-ai-glass-boxes-open-sourcing-a-library-to-enable-intelligibility-in-machine-learning\/","title":{"rendered":"Creating AI glass boxes \u2013 Open sourcing a library to enable intelligibility in machine learning"},"content":{"rendered":"<p><a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-585586 size-large\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-1024x576.png\" alt=\"woman in the city holding and looking at an opened laptop\" width=\"1024\" height=\"576\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-1024x576.png 1024w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-300x169.png 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-768x432.png 768w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-1066x600.png 1066w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-655x368.png 655w, 
https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-343x193.png 343w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788.png 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p>When AI systems impact people\u2019s lives, it is critically important that people understand their behavior. By understanding that behavior, data scientists can properly debug their models. If designers can reason about how models behave, they can convey that information to end users. If doctors, judges, and other decision makers trust the models that underpin intelligent systems, they can make better decisions. More broadly, with a fuller understanding of models, end users might more readily accept the products and solutions powered by AI, while growing demands from regulators might be more easily satisfied.<\/p>\n<p>In practice, achieving intelligibility can be complex and highly dependent on a host of variables and human factors, precluding anything resembling a \u201cone-size-fits-all\u201d approach. 
Intelligibility is an area of cutting-edge, interdisciplinary research, building on ideas from machine learning, psychology, human-computer interaction, and design.<\/p>\n<p><a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/project\/intelligible-interpretable-and-transparent-machine-learning\/\">Researchers at Microsoft<\/a> have been working on how to create intelligible AI for years, and we are extremely excited to announce today that we are <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/github.com\/Microsoft\/interpret\">open sourcing under MIT license a software toolkit<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> \u2013 <em>InterpretML<\/em> \u2013 that will enable developers to experiment with a variety of methods for explaining models and systems. <em>InterpretML<\/em> implements a number of intelligible models, including the Explainable Boosting Machine (an improvement over generalized additive models), as well as several methods for generating explanations of the behavior of black-box models or their individual predictions.<\/p>\n<p>With an easy way to access many intelligibility methods, developers will be able to compare and contrast the explanations produced by different methods and to select the methods that best suit their needs. Such comparisons can also help data scientists understand how much to trust the explanations by checking for consistency between methods.<\/p>\n<p>We are looking forward to engaging with the open-source community in continuing to develop <em>InterpretML<\/em>. 
If you are interested, we warmly invite you to <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/github.com\/Microsoft\/interpret\">join us on GitHub<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When AI systems impact people\u2019s lives, it is critically important that people understand their behavior. By understanding their behavior, data scientists can properly debug their models. If able to reason how models behave, designers can convey that information to end users. If doctors, judges and other decision makers trust the models that underpin intelligent systems, [&hellip;]<\/p>\n","protected":false},"author":38022,"featured_media":585586,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr-author-ordering":null,"msr_hide_image_in_river":0,"footnotes":""},"categories":[194467],"tags":[],"research-area":[13556],"msr-region":[],"msr-event-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-impact-theme":[],"msr-promo-type":[],"msr-podcast-series":[],"class_list":["post-585421","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artifical-intelligence","msr-research-area-artificial-intelligence","msr-locale-en_us"],"msr_event_details":{"start":"","end":"","location":""},"podcast_url":"","podcast_episode":"","msr_research_lab":[],"msr_impact_theme":[],"related-publications":[],"related-downloads":[],"related-videos":[],"related-academic-programs":[],"related-groups":[],"related-projects":[393287],"related-events":[],"related-researchers":[{"type":"user_nicename","value":"Harsha Nori","user_id":41461,"display_name":"Harsha Nori","author_link":"<a 
href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/people\/hanori\/\" aria-label=\"Visit the profile page for Harsha Nori\">Harsha Nori<\/a>","is_active":false,"last_first":"Nori, Harsha","people_section":0,"alias":"hanori"},{"type":"guest","value":"samuel-jenkins","user_id":"586120","display_name":"Samuel  Jenkins","author_link":"<a href=\"https:\/\/www.linkedin.com\/in\/jenkinsjsamuel\" aria-label=\"Visit the profile page for Samuel  Jenkins\">Samuel  Jenkins<\/a>","is_active":true,"last_first":"Jenkins, Samuel ","people_section":0,"alias":"samuel-jenkins"},{"type":"user_nicename","value":"Paul Koch","user_id":33207,"display_name":"Paul Koch","author_link":"<a href=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/people\/paulkoch\/\" aria-label=\"Visit the profile page for Paul Koch\">Paul Koch<\/a>","is_active":false,"last_first":"Koch, Paul","people_section":0,"alias":"paulkoch"},{"type":"guest","value":"ester-de-nicolas","user_id":"568668","display_name":"Ester de Nicolas","author_link":"Ester de Nicolas","is_active":true,"last_first":"de Nicolas, Ester","people_section":0,"alias":"ester-de-nicolas"}],"msr_type":"Post","featured_image_thumbnail":"<img width=\"960\" height=\"540\" src=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788.png\" class=\"img-object-cover\" alt=\"a man standing in front of a building\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788.png 1400w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-300x169.png 300w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-768x432.png 768w, 
https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-1024x576.png 1024w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-1066x600.png 1066w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-655x368.png 655w, https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/05\/OSS-Release-Of-Our-Interpretability-Library_Site_1400x788-343x193.png 343w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","byline":"","formattedDate":"May 10, 2019","formattedExcerpt":"When AI systems impact people\u2019s lives, it is critically important that people understand their behavior. By understanding their behavior, data scientists can properly debug their models. If able to reason how models behave, designers can convey that information to end users. 
If doctors, judges and&hellip;","locale":{"slug":"en_us","name":"English","native":"","english":"English"},"_links":{"self":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts\/585421","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/users\/38022"}],"replies":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/comments?post=585421"}],"version-history":[{"count":10,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts\/585421\/revisions"}],"predecessor-version":[{"id":929016,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/posts\/585421\/revisions\/929016"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media\/585586"}],"wp:attachment":[{"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/media?parent=585421"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/categories?post=585421"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/tags?post=585421"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=585421"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=585421"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=585421"},{"taxonomy":"msr-locale","embeddable":true,"href":"htt
ps:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=585421"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=585421"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=585421"},{"taxonomy":"msr-promo-type","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-promo-type?post=585421"},{"taxonomy":"msr-podcast-series","embeddable":true,"href":"https:\/\/cm-edgetun.pages.dev\/en-us\/research\/wp-json\/wp\/v2\/msr-podcast-series?post=585421"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}