{"id":40967,"date":"2020-06-09T09:47:38","date_gmt":"2020-06-09T09:47:38","guid":{"rendered":"http:\/\/icloud.pe\/blog\/?guid=e20a6b36404cd7756016358c5794eeb2"},"modified":"2020-06-09T09:47:38","modified_gmt":"2020-06-09T09:47:38","slug":"ibm-to-kill-its-own-facial-recognition-technology","status":"publish","type":"post","link":"https:\/\/icloud.pe\/blog\/ibm-to-kill-its-own-facial-recognition-technology\/","title":{"rendered":"IBM to kill its own facial recognition technology"},"content":{"rendered":"<p><span class=\"field field-name-field-author field-type-node-reference field-label-hidden\"><br \/>\n      <span class=\"field-item even\"><a href=\"https:\/\/www.cloudpro.co.uk\/authors\/keumars-afifi-sabet-0\">Keumars Afifi-Sabet<\/a><\/span><br \/>\n  <\/span><\/p>\n<div class=\"field field-name-field-published-date field-type-datetime field-label-hidden\">\n<div class=\"field-items\">\n<div class=\"field-item even\"><span class=\"date-display-single\">9 Jun, 2020<\/span><\/div>\n<\/p><\/div>\n<\/div>\n<p class=\"short-teaser\">\n<a href=\"https:\/\/www.cloudpro.co.uk\/\" title=\"\" class=\"combined-link\"><\/a><\/p>\n<div class=\"field field-name-body\">\n<p><span data-cke-copybin-start=\"1\">\u200b<\/span>IBM has decided to &#8220;sunset&#8221; its general-purpose facial recognition and analysis software suite over ethical concerns following a fortnight of Black Lives Matter protests.<\/p>\n<p>Despite putting a lot of efforts into developing its AI-powered tools, the cloud giant will no longer distribute these systems for fear that it could be used for purposes that go against the company\u2019s principles of trust and transparency.\u00a0<\/p>\n<p><!--wysiwyg_see-related_plugin--><\/p>\n<p>\nSpecifically, there are concerns the technology could be used for mass surveillance, racial profiling and the violations of basic human rights and freedoms. 
This comes in addition to the company now <a href=\"https:\/\/www.itpro.co.uk\/security\/privacy\/354627\/how-can-facial-recognition-be-made-safer\" >deploring the use of facial recognition in principle<\/a>, including by rival vendors, for such purposes.<\/p>\n<p>&#8220;We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,\u201d CEO Arvind Krishna <a href=\"https:\/\/www.ibm.com\/blogs\/policy\/facial-recognition-susset-racial-justice-reforms\/\" >outlined in a letter to the US Congress<\/a>.<\/p>\n<p>&#8220;Artificial intelligence is a powerful tool that can help law enforcement keep citizens safe. But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.&#8221;<\/p>\n<p>The announcement represents a major shift, given the company has previously ploughed considerable money and effort into building out its capabilities, and has occasionally courted controversy in the process.<\/p>\n<p>In March 2019, for example, IBM was called out for <a href=\"https:\/\/www.itpro.co.uk\/technology\/33218\/ibm-used-flickr-photos-to-train-image-recognition-tech-without-user-consent\" >using almost a million photos from photo-sharing site Flickr<\/a> to train its facial recognition algorithms without the consent of the subjects. 
Those in the pictures weren\u2019t advised the firm was going to use their images to help determine gender, race and other identifiable features, such as hair colour.<\/p>\n<p>Several months before that, the company was found to have been <a href=\"https:\/\/www.itpro.co.uk\/technology\/31865\/ibm-reportedly-used-nypd-surveillance-footage-for-object-identification-tech\" >secretly using video footage collected by the New York Police Department (NYPD)<\/a> to develop software that could identify individuals based on distinguishable characteristics.<\/p>\n<p>IBM had created a system that allowed officers to search for potential criminals based on tags, including facial features, clothing colour, facial hair, skin colour, age, gender and more. Overall, it could identify more than 16,000 data points, making it extremely accurate at recognising faces.<\/p>\n<p>While the use of facial recognition in law enforcement is not uncommon, it has run into legal blockades, with jurisdictions such as San Francisco <a href=\"https:\/\/www.itpro.co.uk\/policy-legislation\/33649\/san-francisco-bans-facial-recognition-technology\" >banning its use altogether<\/a>, for example.<\/p>\n<p>Police forces in the UK, meanwhile, have been trialling such systems, but the <a href=\"https:\/\/www.cloudpro.co.uk\/cloud-essentials\/4719\/ico-launches-cloud-data-protection-guide\" >Information Commissioner\u2019s Office (ICO)<\/a> has effectively neutered these plans after urging forces to assess data protection risks and ensure there\u2019s no bias in the software being used.<\/p>\n<p>In addition to permanently withdrawing its facial recognition technology, IBM has called for a national policy that encourages the use of technology to bring greater transparency and accountability to policing. 
Such technologies may include body cameras and data analytics techniques.<\/p>\n<p>A number of other major companies have, much in step with IBM until now, developed their own AI-powered facial recognition capabilities, which have often also courted controversy.<\/p>\n<p>AWS has come under fire over alleged racial and gender bias in its highly sophisticated Rekognition technology. The company\u2019s shareholders <a href=\"https:\/\/www.itpro.co.uk\/technology\/33717\/amazon-rejects-revolt-over-its-facial-recognition-technology\">overturned an internal revolt over the sale of Rekognition to the police<\/a> by an overwhelming majority of 97% in May 2019, for example.<\/p>\n<p>The claims were based on MIT research that found it <a href=\"https:\/\/www.itpro.co.uk\/technology\/32846\/mit-research-finds-ethnic-and-gender-bias-in-amazon-rekognition\" >mistakenly identified some pictures of women as men 31% of the time<\/a>, an error that was more prevalent with pictures of darker-skinned women. This compared with an error rate of 1.5% for Microsoft\u2019s software. 
<\/p>\n<\/p><\/div>\n","protected":false},"excerpt":{"rendered":"<p>      Keumars Afifi-Sabet<\/p>\n<p>        9 Jun, 2020    <\/p>\n<p>      \u200bIBM has decided to &#8220;sunset&#8221; its general-purpose facial recognition and analysis software suite over ethical concerns following a fortnight of Black Lives Matter protests.<br \/>\nDespite putting &#8230;<\/p>\n","protected":false},"author":433,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[],"tags":[],"class_list":["post-40967","post","type-post","status-publish","format-standard","hentry"],"_links":{"self":[{"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/posts\/40967","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/users\/433"}],"replies":[{"embeddable":true,"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/comments?post=40967"}],"version-history":[{"count":1,"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/posts\/40967\/revisions"}],"predecessor-version":[{"id":40968,"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/posts\/40967\/revisions\/40968"}],"wp:attachment":[{"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/media?parent=40967"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/categories?post=40967"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/icloud.pe\/blog\/wp-json\/wp\/v2\/tags?post=40967"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}