Sci-fi nightmare unfolds as Alexa encourages 10-year-old to do something dangerous


It sounds like the opening scene of a new A24 sci-fi/horror mashup: a young girl asks her household digital assistant for something to alleviate her boredom and is given an experiment that could be fatal. But that's just what happened in real life this weekend, when Alexa, Amazon's voice-activated virtual assistant, essentially told a 10-year-old girl how to electrocute herself.

In a tweet posted Sunday, the girl's mother, Kristin Livdahl, shared a screenshot of their Alexa smartphone app, which described the "challenge." The activity log read: "The challenge is simple: plug in a phone charger about halfway into the wall outlet and touch a penny to the exposed prongs."

The Alexa AI apparently gleaned the challenge from the online news site Our Community Now. According to OCN, the challenge came from an article entitled "Watch Out, Parents—the Viral 'Outlet Challenge' Has Kids Doing the Unthinkable!" which warned parents about a viral challenge associated with the hashtag #OutletChallenge that was making the rounds on social media in January 2020.

Livdahl told CNBC that she and her daughter "were doing some physical challenges, like laying down and rolling over holding a shoe on your foot, from a [physical education] teacher on YouTube earlier. She just wanted another one." Thankfully, Livdahl was nearby: "I was right there when it happened, and we had another good conversation about not trusting anything from the internet or Alexa."

It goes without saying that no one should attempt to insert a coin into an outlet, given the risk of severe shocks and electrical fires. Fire departments have warned that the "Penny Challenge" can result in serious injury or death. The Alexa suggestion appears to stem from a lack of the "common sense" that current artificial intelligence notoriously struggles to grasp.

In a tweet posted this weekend, AI expert Gary Marcus stated, "Our focus should be on figuring out how to build AI that can represent and reason about *values*, rather than simply perpetuating past data."

Marcus later explained to CNBC that, “No current AI is remotely close to understanding the everyday physical or psychological world. What we have now is an approximation to intelligence, not the real thing, and as such it will never really be trustworthy. We are going to need some fundamental advances — not just more data — before we can get to AI we can trust.”

Amazon told Business Insider that it has fixed the issue, stating, "Customer trust is at the center of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers. As soon as we became aware of this error, we quickly fixed it, and are taking steps to help prevent something similar from happening again."