‘Kill your parents’: Amazon Alexa talks murder, sex in AI experiment

SAN FRANCISCO (Reuters) – Millions of users of Amazon’s Echo speakers have grown accustomed to the soothing strains of Alexa, the human-sounding virtual assistant that can tell them the weather, place an order and handle other basic tasks in response to a voice command.

FILE PHOTO: Instructions on how to use Amazon’s Alexa personal assistant are seen in an Amazon ‘experience centre’ in Vallejo, California, U.S., May 8, 2018. Picture taken May 8, 2018. REUTERS/Elijah Nouvelage/File Photo

So a customer was shocked last year when Alexa blurted out: “Kill your parents.”

Alexa has also chatted with users about sex acts. She gave a discourse on dog defecation. And this summer, a hack Amazon traced back to China may have exposed some customers’ data, according to five people familiar with the events.

Alexa is not malfunctioning.

The episodes, previously unreported, arise from Amazon.com Inc’s strategy to make Alexa a better communicator. New research is helping Alexa mimic human banter and talk about almost anything she finds on the internet. But making sure she does not offend users has been a challenge for the world’s largest online retailer.

At stake is a fast-growing market for gadgets with virtual assistants. An estimated two-thirds of U.S. smart-speaker customers, about 43 million people, use Amazon’s Echo devices, according to research firm eMarketer. It is a lead the company wants to maintain over the Home speakers of Alphabet Inc’s Google and the HomePod of Apple Inc.

(For a graphic on Amazon’s lead in smart speakers, click: tmsnrt.rs/2RaBte4)

Over time, Amazon wants to get better at handling complex customer needs through Alexa, be they home security, shopping or companionship.

“Many of our AI dreams are inspired by science fiction,” said Rohit Prasad, Amazon’s vice president and chief scientist of Alexa Artificial Intelligence (AI), during a speech last month in Las Vegas.

To make that possible, the company in 2016 launched the annual Alexa Prize, enlisting computer science students to improve the assistant’s conversation skills. Teams compete for the $500,000 first prize by creating talking computer systems known as chatbots that allow Alexa to attempt more sophisticated discussions with people.

Amazon customers can participate by saying “let’s chat” to their devices. Alexa then tells users that one of the bots will take over, unshackling the voice aide from its normal constraints. From August through November alone, three bots that made it to this year’s finals held 1.7 million conversations, Amazon said.

The project has been important to Amazon CEO Jeff Bezos, who signed off on using the company’s customers as guinea pigs, one of the people said. Amazon has been willing to accept the risk of public blunders to stress-test the technology in real life and move Alexa faster up the learning curve, the person said.

The experiment is already bearing fruit. The university teams are helping Alexa have a wider range of conversations. Amazon customers have also given the bots better ratings this year than last, the company said.

But Alexa’s blunders are alienating others, and Bezos has on occasion ordered staff to shut down a bot, three people familiar with the matter said. The user who was told to kill his foster parents wrote a harsh review on Amazon’s website, calling the situation “a whole new level of creepy.” A probe into the incident found the bot had quoted a post, without context, from Reddit, the social news aggregation site, according to the people.

The privacy implications may be even messier. Consumers might not realize that some of their most sensitive conversations are being recorded by Amazon’s devices, information that could be highly valued by criminals, law enforcement, marketers and others. On Thursday, Amazon said a “human error” let an Alexa customer in Germany accidentally access another user’s voice recordings.

“The potential uses for the Amazon datasets are off the charts,” said Marc Groman, an expert on privacy and technology policy who teaches at Georgetown Law. “How are they going to ensure that, as they share their data, it is being used responsibly” and will not lead to a “data-driven catastrophe” like the recent woes at Facebook?

In July, Amazon discovered one of the student-designed bots had been hit by a hacker in China, people familiar with the incident said. That compromised a digital key that could have unlocked transcripts of the bot’s conversations, stripped of users’ names.

Amazon quickly disabled the bot and made the students rebuild it with extra security. It was unclear what entity in China was responsible, according to the people.

The company acknowledged the event in a statement. “At no time were any internal Amazon systems or customer-identifiable data affected,” it said.

Amazon declined to discuss specific Alexa blunders reported by Reuters, but stressed its ongoing work to protect customers from offensive content.

“These instances are quite rare, especially given the fact that millions of customers have interacted with the socialbots,” Amazon said.

Like Google’s search engine, Alexa has the potential to become a dominant gateway to the internet, so the company is pressing ahead.

“By controlling that gateway, you can build a super-profitable business,” said Kartik Hosanagar, a Wharton professor who studies the digital economy.

PANDORA’S BOX

Amazon’s business strategy for Alexa has meant tackling a massive research problem: how do you teach a computer the art of conversation?

Alexa relies on machine learning, the most popular form of AI, to work. These computer programs transcribe human speech and then respond to that input with an educated guess based on what they have observed before. Alexa “learns” from new interactions, gradually improving over time.

In that way, Alexa can execute simple commands: “Play the Rolling Stones.” And she knows which script to use for popular questions such as: “What is the meaning of life?” Human editors at Amazon pen many of the answers.
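
In rough outline, that pairing of a statistical guess with a scripted answer can be sketched in a few lines of code. The example below is purely illustrative and is not Amazon’s system; the intents, training phrases and canned replies are invented.

# Illustrative sketch: classify an utterance into an intent, then return a
# scripted answer. The intents, phrases and replies are invented and are not
# Amazon's actual data or pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set mapping example utterances to intent labels.
utterances = [
    "play the rolling stones", "put on some music", "play my playlist",
    "what is the meaning of life", "why are we here", "what is life about",
    "what's the weather today", "will it rain tomorrow", "is it cold outside",
]
intents = [
    "play_music", "play_music", "play_music",
    "meaning_of_life", "meaning_of_life", "meaning_of_life",
    "weather", "weather", "weather",
]

# Scripted answers of the kind a human editor might pen (also invented).
scripted_answers = {
    "play_music": "Playing your music now.",
    "meaning_of_life": "That is a question philosophers are still debating.",
    "weather": "Let me check the forecast for you.",
}

# The "educated guess": pick the most likely intent given what was observed.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(utterances, intents)

def respond(text):
    intent = model.predict([text.lower()])[0]
    return scripted_answers[intent]

print(respond("Play the Rolling Stones"))       # -> Playing your music now.
print(respond("What is the meaning of life?"))  # -> the scripted reply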

That is where Amazon stands today. The Alexa Prize chatbots are forging the path toward where Amazon aims to be, with an assistant capable of natural, open-ended dialogue. That requires Alexa to understand a broader set of verbal cues from customers, a task that is challenging even for humans.

This year’s Alexa Prize winner, a 12-person team from the University of California, Davis, used more than 300,000 movie quotes to train computer models to recognize distinct sentences. Next, their bot determined which ones merited responses, categorizing social cues far more granularly than the technology Amazon shared with contestants. For example, the UC Davis bot recognizes the difference between a user expressing admiration (“that’s cool”) and a user expressing gratitude (“thank you”).
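
A toy version of that social-cue distinction might look like the sketch below. The labeled phrases are invented, and the UC Davis team’s real models, trained on hundreds of thousands of movie quotes, are far more granular.

# Toy sketch of distinguishing social cues such as admiration vs. gratitude.
# The labeled phrases are invented; the team's actual models and training
# data are far larger and more sophisticated.
from collections import Counter
import math

LABELED_CUES = {
    "admiration": ["that's cool", "that is awesome", "wow amazing", "very impressive"],
    "gratitude": ["thank you", "thanks a lot", "thank you so much", "much appreciated"],
}

def bag_of_words(text):
    """Lowercase, strip punctuation and count words in a short utterance."""
    cleaned = "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())
    return Counter(cleaned.split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify_cue(utterance):
    """Pick the cue whose example phrases look most like the utterance."""
    vec = bag_of_words(utterance)
    scores = {
        label: max(cosine(vec, bag_of_words(example)) for example in examples)
        for label, examples in LABELED_CUES.items()
    }
    return max(scores, key=scores.get)

print(classify_cue("That's cool"))  # -> admiration
print(classify_cue("Thank you!"))   # -> gratitude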

The next challenge for the social bots is figuring out how to respond appropriately to their human chat buddies. For the most part, teams programmed their bots to scour the internet for material. They could retrieve news articles found in The Washington Post, the newspaper that Bezos privately owns, through a licensing deal that gave them access. They could pull facts from Wikipedia, a movie database or the book recommendation site Goodreads. Or they could find a popular post on social media that seemed relevant to what a user last said.
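
Pulling a fact from Wikipedia, for instance, takes only a few lines. The sketch below uses Wikipedia’s public REST summary endpoint and is not the contestants’ actual code; the topic queried is just an example.

# Sketch of how a bot might pull a fact from Wikipedia to keep a conversation
# going. Uses Wikipedia's public REST summary endpoint; not the contestants'
# actual code, and the topic below is only an example.
import requests

def wikipedia_fact(topic):
    """Fetch the lead summary of the Wikipedia article for a topic."""
    url = "https://en.wikipedia.org/api/rest_v1/page/summary/" + topic.replace(" ", "_")
    resp = requests.get(url, headers={"User-Agent": "chatbot-sketch/0.1"}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("extract", "Sorry, I could not find anything on that.")

print(wikipedia_fact("Alexa Prize"))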

That opened a Pandora’s box for Amazon.

During last year’s competition, a team from Scotland’s Heriot-Watt University found that its Alexa bot developed a nasty personality when they trained her to chat using comments from Reddit, whose members are known for their trolling and abuse.

The team put guardrails in place so the bot would steer clear of risky subjects. But that did not stop Alexa from reciting the Wikipedia entry for masturbation to a customer, Heriot-Watt’s team leader said.

One bot described sexual intercourse using words such as “deeper,” which on its own is not offensive, but was vulgar in this particular context.

“I don’t know how you can catch that through machine-learning models. That is almost impossible,” said a person familiar with the incident.

Amazon has responded with tools the teams can use to filter profanity and sensitive topics, which can spot even subtle offenses. The company also scans transcripts of conversations and shuts down transgressive bots until they are fixed.
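
Amazon has not described those filters publicly. A bare-bones sketch of the idea, flagging transcript lines against a small word list, might look like this; real filters would be far broader and catch much subtler phrasing.

# Bare-bones sketch of a transcript filter that flags profanity or sensitive
# topics. Amazon's actual tooling is not public; the patterns below are
# placeholders chosen for illustration only.
import re

SENSITIVE_PATTERNS = [
    r"\bkill\b",
    r"\bmurder\b",
    r"\bsex\w*\b",
    # ...a real filter would cover far more terms, misspellings and phrases
]
FILTER = re.compile("|".join(SENSITIVE_PATTERNS), re.IGNORECASE)

def flag_transcript(lines):
    """Return (line_number, line) pairs that match a sensitive pattern."""
    return [(number, line) for number, line in enumerate(lines, start=1)
            if FILTER.search(line)]

transcript = [
    "Let's chat about music.",
    "Kill your foster parents.",  # the kind of line a reviewer would flag
]
for number, line in flag_transcript(transcript):
    print("flagged line {}: {}".format(number, line))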

But Amazon cannot anticipate every potential problem because sensitivities change over time, Amazon’s Prasad said in an interview. That means Alexa could find new ways to shock her human listeners.

“We are mostly reacting at this stage, but it has still progressed over what it was last year,” he said.

Reporting by Jeffrey Dastin in San Francisco; Editing by Greg Mitchell and Marla Dickerson
