{"id":306,"date":"2018-04-01T13:35:12","date_gmt":"2018-04-01T13:35:12","guid":{"rendered":"http:\/\/sigai.acm.org\/aimatters\/blog\/?p=306"},"modified":"2018-04-01T13:35:12","modified_gmt":"2018-04-01T13:35:12","slug":"facebook-google-and-bias","status":"publish","type":"post","link":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/","title":{"rendered":"FaceBook, Google, and Bias"},"content":{"rendered":"<p>Current events involving FaceBook and the use of data they collect and analyze relate to issues addressed by SIGAI and USACM working groups on algorithmic accountability, transparency, and bias. The players in this area of ethics and policy include those who are unaware of the issues and ones who intentionally use methods and systems with bias to achieve organizational goals. The issues around use of customer data in ways that are not transparent, or difficult to discover, not only have negative impacts on individuals and society, but they also are difficult to address because they are integral to business models upon which companies are based.<\/p>\n<p>A <em>Forbes<\/em> <a href=\"https:\/\/www.forbes.com\/sites\/parmyolson\/2018\/03\/13\/google-deepmind-ai-machine-learning-bias\/#27df64c36829\">recent article<\/a> \u201cGoogle&#8217;s DeepMind Has An Idea For Stopping Biased AI\u201d discusses research that addresses AI systems that spread prejudices that humans have about race and gender \u2013 the issue that when artificial intelligence is trained with biased data, \u00a0biased decisions may be made. An example cited in the article include facial recognition systems shown to have\u00a0difficulty properly recognizing black women.<\/p>\n<p>Machine-learning software is rapidly becoming widely accessible to developers across the world, many of whom are not aware of the dangers of using data contain biases. 
The <em>Forbes<\/em> piece discusses the paper \u201cPath-Specific Counterfactual Fairness,\u201d by DeepMind researchers Silvia Chiappa and Thomas Gillam. Counterfactual fairness refers to an approach to automated decision-making in which a decision is judged fair for an individual if it would have been the same in a counterfactual world where the individual&#8217;s sensitive attributes, such as race or gender, were different. DeepMind has a new division, <a href=\"https:\/\/deepmind.com\/applied\/deepmind-ethics-society\/\">DeepMind Ethics &amp; Society<\/a>, that addresses this and other issues concerning the ethical and social impacts of AI technology.<\/p>\n<p>The <em>Forbes<\/em> article quotes Kriti Sharma, an artificial intelligence consultant with Sage, the British enterprise software company, as follows: &#8220;Understanding the risk of bias in AI is not a problem that technologists can solve in a vacuum. We need collaboration between experts in anthropology, law, policy makers, business leaders to address the questions emerging technology will continue to ask of us. It is exciting to see increased academic research activity in AI fairness and accountability over the last 18 months, but in truth we aren&#8217;t seeing enough business leaders, companies applying AI, those who will eventually make AI mainstream in every aspect of our lives, take the same level of responsibility to create unbiased AI.&#8221;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Current events involving FaceBook and the use of data they collect and analyze relate to issues addressed by SIGAI and USACM working groups on algorithmic accountability, transparency, and bias. 
The players in this area of ethics and policy include those who are unaware of the issues and ones who intentionally use methods and systems with [&hellip;]<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","inline_featured_image":false,"footnotes":""},"categories":[12],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v24.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>FaceBook, Google, and Bias - ACM SIGAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"FaceBook, Google, and Bias - ACM SIGAI\" \/>\n<meta property=\"og:description\" content=\"Current events involving FaceBook and the use of data they collect and analyze relate to issues addressed by SIGAI and USACM working groups on algorithmic accountability, transparency, and bias. 
The players in this area of ethics and policy include those who are unaware of the issues and ones who intentionally use methods and systems with [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/\" \/>\n<meta property=\"og:site_name\" content=\"ACM SIGAI\" \/>\n<meta property=\"article:published_time\" content=\"2018-04-01T13:35:12+00:00\" \/>\n<meta name=\"author\" content=\"Larry Medsker\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Larry Medsker\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"2 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/\",\"url\":\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/\",\"name\":\"FaceBook, Google, and Bias - ACM SIGAI\",\"isPartOf\":{\"@id\":\"https:\/\/sigai.acm.org\/main\/#website\"},\"datePublished\":\"2018-04-01T13:35:12+00:00\",\"author\":{\"@id\":\"https:\/\/sigai.acm.org\/main\/#\/schema\/person\/5097a3e1c76f2c205fe0f5ebb9b51fdb\"},\"breadcrumb\":{\"@id\":\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/sigai.acm.org\/main\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"FaceBook, Google, and 
Bias\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/sigai.acm.org\/main\/#website\",\"url\":\"https:\/\/sigai.acm.org\/main\/\",\"name\":\"ACM SIGAI\",\"description\":\"ACM Special Interest Group on Artificial Intelligence\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/sigai.acm.org\/main\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/sigai.acm.org\/main\/#\/schema\/person\/5097a3e1c76f2c205fe0f5ebb9b51fdb\",\"name\":\"Larry Medsker\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/sigai.acm.org\/main\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/a175bde07d4c8846a16bc64afa6e97f1?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/a175bde07d4c8846a16bc64afa6e97f1?s=96&d=mm&r=g\",\"caption\":\"Larry Medsker\"},\"url\":\"https:\/\/sigai.acm.org\/main\/author\/larrym\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"FaceBook, Google, and Bias - ACM SIGAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/","og_locale":"en_US","og_type":"article","og_title":"FaceBook, Google, and Bias - ACM SIGAI","og_description":"Current events involving FaceBook and the use of data they collect and analyze relate to issues addressed by SIGAI and USACM working groups on algorithmic accountability, transparency, and bias. 
The players in this area of ethics and policy include those who are unaware of the issues and ones who intentionally use methods and systems with [&hellip;]","og_url":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/","og_site_name":"ACM SIGAI","article_published_time":"2018-04-01T13:35:12+00:00","author":"Larry Medsker","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Larry Medsker","Est. reading time":"2 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/","url":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/","name":"FaceBook, Google, and Bias - ACM SIGAI","isPartOf":{"@id":"https:\/\/sigai.acm.org\/main\/#website"},"datePublished":"2018-04-01T13:35:12+00:00","author":{"@id":"https:\/\/sigai.acm.org\/main\/#\/schema\/person\/5097a3e1c76f2c205fe0f5ebb9b51fdb"},"breadcrumb":{"@id":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/sigai.acm.org\/main\/2018\/04\/01\/facebook-google-and-bias\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/sigai.acm.org\/main\/"},{"@type":"ListItem","position":2,"name":"FaceBook, Google, and Bias"}]},{"@type":"WebSite","@id":"https:\/\/sigai.acm.org\/main\/#website","url":"https:\/\/sigai.acm.org\/main\/","name":"ACM SIGAI","description":"ACM Special Interest Group on Artificial 
Intelligence","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/sigai.acm.org\/main\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/sigai.acm.org\/main\/#\/schema\/person\/5097a3e1c76f2c205fe0f5ebb9b51fdb","name":"Larry Medsker","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/sigai.acm.org\/main\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/a175bde07d4c8846a16bc64afa6e97f1?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/a175bde07d4c8846a16bc64afa6e97f1?s=96&d=mm&r=g","caption":"Larry Medsker"},"url":"https:\/\/sigai.acm.org\/main\/author\/larrym\/"}]}},"_links":{"self":[{"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/posts\/306"}],"collection":[{"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/comments?post=306"}],"version-history":[{"count":0,"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/posts\/306\/revisions"}],"wp:attachment":[{"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/media?parent=306"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/categories?post=306"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sigai.acm.org\/main\/wp-json\/wp\/v2\/tags?post=306"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}