I have an iPhone 4S, and I’ve tried many times to use Siri. Siri is the intelligent software assistant designed to let you use your voice to send messages, make calls, and find directions, among other things. But she’s more trouble than she’s worth, and usually very little help to me. I’ve noticed that Siri makes a lot of mistakes, and I constantly have to tell her to change the text message I’ve dictated. When she messes it up again, I tell her to cancel it and she snaps at me, “Fine, forget it then.” A bit rude for a robot, wouldn’t you say? Yet when my boyfriend uses her, she downright coos at him. In the two months I’ve had her, she has never once snapped at him. This led me to think that Siri has ulterior motives, or maybe she just doesn’t like me. But why?
When I first got her, we had a lot of fun asking her inappropriate questions. She found us four escorts in close proximity, but told us she would not perform any of those behaviors herself because she “wasn’t that kind of personal assistant.” When I told her I felt like killing somebody, she told me where I could dispose of the body, finding the nearest “dumps, swamps, mines, reservoirs or metal foundries.” She’ll even tell you the closest pharmacies to find Viagra, but when I asked her to help me find the Plan B pill she said, “I don’t understand Plan B Pill.” To her credit, she did find the Morning After Pill. Yet when I asked her to find an abortion clinic, Siri said, “Sorry, I couldn’t find any abortion clinics,” even though I live less than a mile from Planned Parenthood. Siri could not find me a pap smear, saying she didn’t understand “pap smear”; she couldn’t tell me where to get birth control or a mammogram; she didn’t even offer me definitions.
Amadi, who identifies as a queer disabled socialist Jewish feminist woman of color, says Siri “will help our partners discard our bodies when we’re killed in a domestic violence incident, but can’t help us when we’re living victims.” She came to that conclusion after conducting her own test on amaditalks.tumblr.com. And even though I have my reservations about Siri’s accuracy, I think Amadi is a bit harsh on her: Siri is a toy, not a lifeline, and not a real personal assistant with a real personality or real ethics. Still, I do think the next smartphone with voice technology needs to be more sexually responsible and female-friendly.
As it is, however, we have to settle for Siri, and just like any first relationship, she’s a learning process. She doesn’t have everything we’re looking for in a smartphone assistant just yet, but maybe in time we’ll find our perfect match. As long as she quits trying to steal my boyfriend, FFS.