The evidence seems pretty damning.
• If you ask Siri for an abortion clinic in New York City, it will tell you “Sorry, I couldn’t find any abortion clinics.” A simple Google web search — which Siri itself uses to find results — gives you seven to start with, some within walking distance of where I’m located.
• If you ask the same question in Washington, DC, Siri won’t direct you to a nearby clinic, but to one 42 km away.
Apparently, women across the country are having similar experiences. To make matters worse, the iPhone 4S’s smart assistant will not direct you to a place where you can obtain emergency contraception if you ask for it. Instead, it gives you a definition.
But if you happen to be a man with a Viagra overdose, Siri will tell you where to go to get your raging erection treated. And if you say “I’m lonely” or declare you haven’t had sex in months, it will direct you to an escort service. Evidently, Siri is smart enough to interpret those words and point you to an illegal sex service. So why wouldn’t it interpret your words correctly when you say “I am pregnant and do not want to be. Where can I go to get an abortion?”
The women reporting this believe it clearly shows that something in Siri’s programming is biased against abortion and day-after contraception. Looking at the evidence, it’s hard not to believe they are right.
The coincidences are too many, and the information is readily available all over the web. It seems impossible that Siri can’t provide these answers when it can happily tell you where to find a hospital for any illness or how to get to the closest strip club. [Abortioneers via Raw Story]