UN experts sound alarm over AI-enhanced racial profiling

By Nina LARSON

Geneva (AFP) Nov 26, 2020

Countries must do more to combat racial profiling, UN rights experts said Thursday, warning that artificial intelligence programmes like facial recognition and predictive policing risked reinforcing the harmful practice.

Racial profiling is not new, but the technologies once seen as tools for bringing more objectivity and fairness to policing appear in many places to be making the problem worse.

"There is a great risk that (AI technologies will) reproduce and reinforce biases and aggravate or lead to discriminatory practices," Jamaican human rights expert Verene Shepherd told AFP.

She is one of the 18 independent experts who make up the UN Committee on the Elimination of Racial Discrimination (CERD), which on Thursday published guidance on how countries worldwide should work to end racial profiling by law enforcement.

The committee, which monitors compliance by the 182 countries that have signed the International Convention on the Elimination of All Forms of Racial Discrimination, raised particular concern over the use of AI algorithms for so-called "predictive policing" and "risk assessment".

Such systems have been touted as a way to make better use of limited police budgets, but research suggests they can increase deployments to communities that have already been identified, rightly or wrongly, as high-crime zones.

- 'Dangerous feedback loop' -

"Historical arrest data about a neighbourhood may reflect racially biased policing practices," Shepherd warned.

"Such data will deepen the risk of over-policing in the same neighbourhood, which in turn may lead to more arrests, creating a dangerous feedback loop."

When artificial intelligence and algorithms use biased historical data, their profiling predictions will reflect that.

"Bad data in, bad results out," Shepherd said. "We are concerned about what goes into making those assumptions and those predictions."
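The feedback loop Shepherd describes can be illustrated with a minimal simulation. This is a hypothetical sketch, not any real predictive-policing system: two neighbourhoods with identical true crime rates, where biased historical arrest counts steer patrols toward neighbourhood A, and patrols in turn generate the arrests that feed the next round of predictions.

```python
# Hypothetical sketch of the "dangerous feedback loop": equal true crime
# rates, but biased historical data directs more patrols to A, and
# arrests can only be recorded where patrols are sent.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.1            # identical in both neighbourhoods
arrests = {"A": 60, "B": 40}     # biased history: A was over-policed

for year in range(5):
    total = arrests["A"] + arrests["B"]
    # "predictive" allocation: patrols proportional to past arrests
    patrols = {n: round(100 * arrests[n] / total) for n in arrests}
    for n in arrests:
        # more patrols -> more recorded arrests at the same true rate
        arrests[n] += sum(random.random() < TRUE_CRIME_RATE
                          for _ in range(patrols[n]))

share_a = arrests["A"] / (arrests["A"] + arrests["B"])
print(f"share of arrests in A after 5 years: {share_a:.2f}")
```

Even though both neighbourhoods have the same underlying crime rate, the initial bias in the arrest data never washes out: neighbourhood A keeps receiving more patrols and therefore keeps accumulating more recorded arrests, which is the "bad data in, bad results out" dynamic in miniature.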
The CERD recommendations also take issue with the growing use of facial recognition and surveillance technologies in policing.

Shepherd said the committee had received a number of complaints about misidentification by such technologies, sometimes with dire consequences, but did not provide specific examples.

The issue came to the forefront with the wrongful arrest in Detroit earlier this year of an African American man, Robert Williams, based on a flawed algorithm which identified him as a robbery suspect.

Various studies show facial recognition systems developed in Western countries are far less accurate in distinguishing darker-skinned faces, perhaps because they rely on databases containing more white, male faces.

- 'Misidentification' -

"We have had complaints of such misidentification because of where the technologies are coming from, who is making them, and what samples they have in their system," Shepherd said. "It is a real concern."

CERD is calling for countries to regulate private companies that develop, sell or operate algorithmic profiling systems for law enforcement.

Countries have a responsibility to ensure that such systems comply with international human rights law, it said, stressing the importance of transparency in design and application.

The committee insisted the public should be informed when such systems are being used and told how they work, what data sets are being used and what safeguards are in place to prevent rights abuses.

The recommendations meanwhile go beyond the impact of new technologies, urging countries to introduce laws against all forms of racial discrimination by law enforcement.

"Racial profiling precedes these technologies," Shepherd said.

She said 2020 -- a year marked by surging racial tensions in many parts of the world -- was a good time to present the new guidelines.

The committee, she said, "hopes that the intensification and globalisation of Black Lives Matter ... and other campaigns calling for attention to discrimination against certain vulnerable groups will help (underline) the importance of the recommendations."
China accuses India of discrimination over latest app ban

Beijing (AFP) Nov 25, 2020

Beijing lashed out at India on Wednesday after it banned another tranche of Chinese apps for national security reasons, the latest sore point between the two nuclear-armed neighbours.

Tensions remain high between Beijing and New Delhi after a deadly June clash in a disputed border area that left 20 Indian soldiers dead and an unspecified number of Chinese casualties.

India banned 43 Chinese apps on Tuesday - including some from e-commerce giant Alibaba - for threatening "sovereignty and integr ...
The content herein, unless otherwise known to be public domain, is Copyright 1995-2024 - Space Media Network. AFP, UPI and IANS news wire stories are copyright Agence France-Presse, United Press International and Indo-Asia News Service. ESA news reports are copyright European Space Agency. All NASA sourced material is public domain.