
Don't ask Siri to define mother twice, unless you want to hear questionable language

The first answer is safe enough; it's the second one that will turn out to be either hilarious or fairly bothersome.

  • Published: May 2, 2018 1:03 PM IST

In what is likely the strangest Siri response yet (and probably a massive Easter egg that simply took a few years to discover), the iOS voice assistant for iPhones and iPads can get a bit, well, abusive. A Reddit user posted on the site's Apple community that asking Siri for an additional definition of the word ‘mother’ elicits a strange and rather inappropriate response.

Asking Siri to define the word ‘mother’ is simple enough, and it fetches the standard Oxford Dictionary definition, which refers to a woman in relation to her child or children, or a female animal in relation to her offspring. According to the Reddit post, Siri then asks if you would like an additional definition for the word.

Saying ‘yes’ to this question fetches the second, more obscene definition: “As a noun, it means, short for ‘motherf****r’.” We don’t have to type it out, but you know what we’re saying. We tested this at the time of writing, and it seems the obscene response is no longer being provided.

While it might seem strange, it’s worth noting that Siri appears to be pulling its response in this case from the Oxford Dictionary website. Further down the Oxford entry for ‘mother’, sure enough, you’ll find the noun definition mentioned above, listed under North American vulgar slang. Siri’s response thus seems to be entirely accidental, a product of the source it draws definitions from, and can safely be chalked up as a comical error. The fact that it no longer appears suggests Apple has since fixed it.


The rise of virtual voice assistants on smartphones and connected devices, such as Siri, Google Assistant and Amazon Alexa, means there are plenty of fun ways like this to test the real-world capabilities of AI today. This one turned out to be relatively harmless (unless Siri blurts out abusive language in front of your kids), and the fact that it has already been fixed is a good sign that these assistants are steadily improving.
