“I would blush if I could.”
That is the title of a new U.N. report, which claims that the female voices of artificial intelligence (AI) assistants like Apple's Siri, Amazon's Alexa and Google Assistant reinforce and spread harmful gender stereotypes: that women are subordinate and put up with poor treatment. It is also what Siri used to say when a user told it, "Hey Siri, you're a b—h."
The report notes that because most voice assistants are female by default, they signal that women are "obliging helpers," always available at the sound of a command like "Hey" or "OK."
"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the report says.
Of particular concern, according to the report, is that the assistants tend to give "deflecting, lackluster or apologetic responses" when insulted, reinforcing the gender bias that women are submissive and let abuse slide.
Apple CEO Tim Cook talks about Siri during an Apple event on March 7, 2012. (REUTERS/Robert Galbraith)
When a Fox News employee told a Siri set to answer as a British man, "Hey Siri, you're a b—h," it responded: "I don't know how to respond to that."
The U.N. report suggests that digital assistants could be programmed to discourage gender-biased insults. It calls on tech companies to stop making their assistants female by default and to increase the representation of women in artificial intelligence fields.
Fox News has reached out to Apple and Amazon for comment.