A general view of the Microsoft Corporation headquarters in Issy-les-Moulineaux, near Paris, France, April 18, 2016. REUTERS/Charles Platiau
Microsoft’s facial recognition technology is getting smarter at recognizing people with dark skin.
On Tuesday, the company touted the progress, but it comes amid growing concern that these technologies will allow surveillance against people of color.
Microsoft's announcement did not broach that concern; the company focused instead on how facial-recognition tech can misidentify both men and women with darker skin. Microsoft says it has recently reduced the system's error rates by up to 20 times.
In February, research from MIT and Stanford University highlighted how facial-recognition technology can be built with a bias. The study found that Microsoft's own system was 99 percent accurate at determining the sex of lighter-skinned subjects, but only 87 percent accurate for darker-skinned subjects.
For women with dark skin, the accuracy rate fell further, to 79 percent. The reason? The computer algorithms that power facial-recognition systems are trained by scanning thousands of different photos and learning to classify them. When those photos feature mainly people with one skin type over another, the algorithms inadvertently develop a bias.
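The mechanism described above can be illustrated with a toy sketch. The code below is not Microsoft's system; it is a hypothetical one-feature classifier trained on a synthetic dataset in which "group A" supplies 95 percent of the training samples. The decision threshold the model learns fits group A well and group B poorly, reproducing the accuracy gap in miniature. All names, distributions, and numbers here are invented for illustration.

```python
import random

random.seed(0)

# Toy data: each sample is (feature, label). The two demographic
# groups have different feature distributions, so a model trained
# mostly on group A generalizes poorly to group B.
def sample(group, label, n):
    # Hypothetical class centers: group B's classes sit closer to
    # the boundary that group A's data implies.
    center = {("A", 0): 0.0, ("A", 1): 4.0,
              ("B", 0): 1.5, ("B", 1): 5.5}[(group, label)]
    return [(random.gauss(center, 1.0), label) for _ in range(n)]

# Imbalanced training set: 95% group A, 5% group B.
train = (sample("A", 0, 475) + sample("A", 1, 475)
         + sample("B", 0, 25) + sample("B", 1, 25))

# "Training": learn a single decision threshold, the midpoint of the
# per-class feature means -- a stand-in for what a real model fits.
mean0 = sum(f for f, y in train if y == 0) / sum(1 for _, y in train if y == 0)
mean1 = sum(f for f, y in train if y == 1) / sum(1 for _, y in train if y == 1)
threshold = (mean0 + mean1) / 2

def accuracy(samples):
    correct = sum((f > threshold) == (y == 1) for f, y in samples)
    return correct / len(samples)

test_a = sample("A", 0, 500) + sample("A", 1, 500)
test_b = sample("B", 0, 500) + sample("B", 1, 500)
print(f"group A accuracy: {accuracy(test_a):.2f}")
print(f"group B accuracy: {accuracy(test_b):.2f}")
```

Because group A dominates the training set, the learned threshold lands near the midpoint of group A's class centers, and group B's accuracy drops by roughly ten percentage points. Rebalancing the training data, as the article describes Microsoft doing, is one direct remedy.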
To remedy the problem, Microsoft launched a new data-collection effort to improve the training data on which its facial-recognition system was built. The company also tweaked how the algorithms classify people.
Microsoft's tech is available as a tool for website and app developers to analyze photos and videos and determine what they contain. But in recent months, civil liberties groups have been sounding the alarm about how facial-recognition systems are used by law enforcement.
The technology may help police identify criminal suspects in photos or videos that human eyes have missed. (Microsoft's own AI algorithms have been used by researchers to mine criminal-justice data.) However, critics note that facial-recognition systems are prone to errors and could be abused to discriminate against immigrants and activists.
A number of Microsoft's own employees are also worried. Last week, a group of them began calling for an end to a contract the company holds with U.S. immigration authorities that may involve Microsoft's facial-recognition tech. Microsoft, however, says the contract covers only cloud services, such as e-mail, calendar, and messaging.
This article originally appeared on PCMag.com.