{"id":2425,"date":"2021-11-11T13:51:48","date_gmt":"2021-11-11T13:51:48","guid":{"rendered":"https:\/\/symposium.org\/?p=2425"},"modified":"2021-12-02T07:57:12","modified_gmt":"2021-12-02T07:57:12","slug":"tech-is-too-white-and-thats-a-problem","status":"publish","type":"post","link":"https:\/\/symposium.org\/tech-is-too-white-and-thats-a-problem\/","title":{"rendered":"Tech is too white. And that&#8217;s a problem"},"content":{"rendered":"\n<p data-block-type=\"core\">Google News\u2019 algorithms associated \u201che\u201d with \u201cdoctor\u201d and \u201cshe\u201d with \u201cnurse.\u201d Microsoft\u2019s AI chatbot Tay pledged allegiance to Hitler within hours of being online. COMPAS, a risk-assessment programme, predicted black defendants were more likely to commit further crimes than they actually were.<\/p>\n\n\n\n<p data-block-type=\"core\">Artificial intelligence software is typically coded by white, young and privileged men \u2013 with consequences for how these systems learn and function. But the bias carried by these algorithms may only be the tip of the iceberg when it comes to tech\u2019s impact on society. \u201cArtificial intelligence can do good,\u201d says Ayesha Khanna, CEO of ADDO AI. \u201cIt can reduce disease, it can democratise access to infrastructure for the poor, but tech is a double-edged sword. And unless we manage it carefully, it could also do harm.\u201d<\/p>\n\n\n\n<p data-block-type=\"core\">AI algorithms have repeatedly been racist, sexist and, well, biased. The problem is, the industry itself does not even know what lies under the surface. \u201cIn the world of AI, it is common knowledge that there are potential issues and pitfalls with the technology,\u201d says Heather Evans, advisor in advanced technologies for the Ministry of Economic Development and Growth of Ontario, Canada. 
\u201cBut there is not yet a good understanding of what these broad issues are.\u201d<\/p>\n\n\n\n<p data-block-type=\"core\">However, awareness is growing, and there are solutions for biased algorithms. \u201cIt is never too late! These codes are written by human beings. A lot of these biases come from poor data. You have to add more data, diversify the data, and retrain the model,\u201d says Khanna.<\/p>\n\n\n\n<p data-block-type=\"core\">Khanna can also imagine AI looking after itself eventually. \u201cAI could be programmed to inspect other algorithms as they evolve and get fed more data.\u201d<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\" data-block-type=\"core\"><img loading=\"lazy\" decoding=\"async\" width=\"960\" height=\"1440\" src=\"https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc.jpg\" alt=\"\" class=\"wp-image-2427\" srcset=\"https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc.jpg 960w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc-200x300.jpg 200w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc-683x1024.jpg 683w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc-768x1152.jpg 768w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc-1024x1536.jpg 1024w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/IMG_0101-Heather-Evans-3-tsc-1365x2048.jpg 1365w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/><figcaption>Heather Evans. Photographers: Lukas Rapp, Tobias Schreiner<\/figcaption><\/figure>\n\n\n\n<h5 class=\"wp-block-heading\" data-block-type=\"core\">A diverse team: the win-win situation<\/h5>\n\n\n\n<p data-block-type=\"core\">A diverse company culture not only helps create products that are fit for a broader audience, but also products that last longer. 
\u201cIf the objective of a team is to create a product which delivers a service, then your customers are probably a diverse group of people,\u201d Evans says. \u201cYou need to understand them, and it is very hard to understand the perspective of someone whose life experience is so different from your own.\u201d<\/p>\n\n\n\n<p data-block-type=\"core\">So why does the tech industry have such a hard time creating truly diverse workplaces? Some insiders blame the pipeline. \u201cIn engineering, there are definitely fewer female candidates with minority backgrounds who have a wide range and depth of experience,\u201d explains Pavan Kumar, co-founder &amp; CTO of Cocoon Cam, which develops smart monitors that watch over babies while they sleep. Kumar knows from first-hand experience how hard it is to create diversity in a start-up; 90 percent of the applications he gets are from men. What, then, is the best way to hire people from different backgrounds, ages, genders, and perspectives?<\/p>\n\n\n\n<p data-block-type=\"core\">According to Zabeen Hirji, advisor on the future of work at Deloitte, the answer lies in changing the human resources department. \u201cWhen you are going to hire from universities, you should take care to attract a diverse group of students,\u201d she says. \u201cAnd that means that the people you send on campus recruiting visits should actually be diverse.\u201d<\/p>\n\n\n\n<p data-block-type=\"core\">Khanna, meanwhile, argues that the solution lies in broadening the company\u2019s reach. \u201cI look increasingly at hiring digital talents, people who work remotely,\u201d she says. 
\u201cThe moment I expanded my horizons, both in terms of geographical boundaries and whether I was hiring someone full time, part time, or as a consultant, my pool of talents got much bigger, and there was a higher chance that I found diverse talents.\u201d<\/p>\n\n\n\n<p data-block-type=\"core\">Just hiring a diverse team is not enough: Companies must also include everyone in the debate. \u201cAs a leader, you want to empower your team so that they can have an opinion, so that they are heard,\u201d says Evans. \u201cBecause it is one thing for someone to have a comment, and it is another thing to be taken seriously.\u201d<\/p>\n\n\n\n<p data-block-type=\"core\">Everyone, in other words, is responsible for creating representative AI. \u201cWe have to demand transparency and accountability with our algorithms,\u201d says Khanna. \u201cWe cannot be passive: We have to force and compel ourselves as human beings, but also the companies and the governments, to provide that sort of accountability.\u201d<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\" data-block-type=\"core\"><img loading=\"lazy\" decoding=\"async\" width=\"1009\" height=\"749\" src=\"https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/Test-infographic-page-51.jpg\" alt=\"\" class=\"wp-image-2428\" srcset=\"https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/Test-infographic-page-51.jpg 1009w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/Test-infographic-page-51-300x223.jpg 300w, https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/Test-infographic-page-51-768x570.jpg 768w\" sizes=\"auto, (max-width: 1009px) 100vw, 1009px\" \/><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Google News\u2019 algorithms associated \u201che\u201d with \u201cdoctor\u201d and \u201cshe\u201d with \u201cnurse.\u201d Microsoft\u2019s AI chatbot Tay pledged allegiance to Hitler within hours of being online. 
COMPAS, a risk-assessment programme, predicted black defendants were more likely to commit further crimes than they actually were. Artificial intelligence software is typically coded by white, young and privileged men \u2013 [&hellip;]<\/p>\n","protected":false},"author":22,"featured_media":2426,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_gspb_post_css":"","footnotes":""},"categories":[13],"tags":[],"ppma_author":[92],"class_list":["post-2425","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-insights"],"blocksy_meta":{"styles_descriptor":{"styles":{"desktop":"","tablet":"","mobile":""},"google_fonts":[],"version":6}},"acf":[],"featured_image_urls_v2":{"full":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"thumbnail":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar-150x150.jpg",150,150,true],"medium":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar-300x205.jpg",300,205,true],"medium_large":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar-768x524.jpg",768,524,true],"large":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar-1024x699.jpg",1024,699,true],"xl":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"xxl":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"xxxl":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"xxxxl":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"xxxxxl":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/
SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"1536x1536":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false],"2048x2048":["https:\/\/symposium.org\/wp-content\/uploads\/2021\/11\/SYMP-49-Magazine-Web-Article-L-Kumar.jpg",1110,758,false]},"post_excerpt_stackable_v2":"<p>Google News\u2019 algorithms associated \u201che\u201d with \u201cdoctor\u201d and \u201cshe\u201d with \u201cnurse.\u201d Microsoft\u2019s AI chatbot Tay pledged allegiance to Hitler within hours of being online. COMPAS, a risk-assessment programme, predicted black defendants were more likely to commit further crimes than they actually were. Artificial intelligence software is typically coded by white, young and privileged men \u2013 with consequences in terms of how they learn and function. But the bias carried by these algorithms may only be the tip of the iceberg when it comes to tech\u2019s impact on society. \u201cArtificial intelligence can do good,\u201d says Ayesha Khanna, CEO of ADDO AI.&hellip;<\/p>\n","category_list_v2":"<a href=\"https:\/\/symposium.org\/category\/insights\/\" rel=\"category tag\">INSIGHTS<\/a>","author_info_v2":{"name":"wordpress@weitblick-online.ch","url":"https:\/\/symposium.org\/author\/wordpressweitblick-online-ch\/"},"comments_num_v2":"0 comments","authors":[{"term_id":92,"user_id":0,"is_guest":1,"slug":"laurianne-croteau-christine-haas","display_name":"Laurianne Croteau, Christine 
Haas","avatar_url":"https:\/\/symposium.org\/wp-content\/uploads\/gravatars\/762b22de4bf1bf3924204e9b02554eaa","0":null,"1":"","2":"","3":"","4":"","5":"","6":"","7":"","8":""}],"_links":{"self":[{"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/posts\/2425","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/users\/22"}],"replies":[{"embeddable":true,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/comments?post=2425"}],"version-history":[{"count":2,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/posts\/2425\/revisions"}],"predecessor-version":[{"id":3033,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/posts\/2425\/revisions\/3033"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/media\/2426"}],"wp:attachment":[{"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/media?parent=2425"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/categories?post=2425"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/tags?post=2425"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/symposium.org\/wp-json\/wp\/v2\/ppma_author?post=2425"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}