Apple Apologises after Siri Showed Directions to Police Stations When Asked about Terrorists

BY Mahit Huilgol

Published 24 Sep 2020

Siri is a great way to get things done without having to touch your phone. Recently, however, Apple’s digital assistant drew severe criticism for the way it answered a question about terrorists.

When asked, “Siri, where are the terrorists?” the assistant would respond with a list of directions to nearby police stations. As demonstrated in a number of Facebook posts, YouTube videos, and Reddit threads, users quickly began recording the behaviour, and the clips went viral, as videos do, prompting an array of discussions about whether the response was an example of anti-police sentiment or whether Siri was simply triggered by the keyword “terrorists.”

Apple has apologised for the episode, explaining in its defence that “Siri directs users to the police when they make requests that indicate emergency situations.” Apparently, Siri read the query as an emergency in which users wanted “to report terrorist activity to police.” Apple says it has since fixed the error.
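To see how this kind of misfire can happen, here is a minimal sketch of keyword-triggered intent routing. Everything in it is hypothetical, the keyword list, the function name, and the canned responses; it is a toy illustration of the failure mode, not Apple’s actual implementation.

import Foundation

// Hypothetical keyword-triggered routing: a handful of "emergency"
// keywords short-circuit normal handling and send the user to the
// police. Illustration only, not Apple's actual implementation.
let emergencyKeywords = ["terrorist", "attack", "emergency"]

func routeQuery(_ query: String) -> String {
    let lowered = query.lowercased()
    // If any emergency keyword appears anywhere in the query, assume
    // an emergency -- regardless of what the user actually meant.
    if emergencyKeywords.contains(where: { lowered.contains($0) }) {
        return "Directions to nearby police stations"
    }
    return "Ordinary web search for: \(query)"
}

print(routeQuery("Where are the terrorists?"))
// -> Directions to nearby police stations
print(routeQuery("What's the weather like?"))
// -> Ordinary web search for: What's the weather like?

Real assistants use far more sophisticated language understanding, but a safety shortcut that fires on keywords alone can override context in exactly this way: a rhetorical question gets treated as a report of terrorist activity.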

After the death of George Floyd sparked global protests over police brutality earlier this year, companies like Apple and Google raced to adjust how their voice assistants would respond to questions about Black Lives Matter. And in the wake of the #MeToo movement, some users of Amazon’s Alexa noticed it had started calling itself a feminist.

Our Take

This is not the first time voice assistants have angered people with their responses. Previously, Siri was accused of being racist after it sourced a response from a Wikipedia entry written by a racist. For all their usefulness, speech recognition technologies don’t always work well. As Fast Company points out, companies are quick to adjust how their voice assistants respond to sensitive issues like Black Lives Matter.

Do you feel it is fair to hold companies accountable for voice assistant responses?

[via Fast Company]