Apple's artificially intelligent personal assistant, Siri, has recently come under fire from Chinese critics over concerns that the program can be used to find "distasteful services".
Since Siri's introduction to the Chinese market with iOS 6 in August, Chinese netizens and prudes have complained that Siri is too functional; now it appears they may have been correct. When prompted about sexual services, such as finding a "massage parlor", Siri was able to pinpoint areas close to the user that offer special services. This feature has drawn the ire of Chinese parents and social watchdogs.
The issue is so large that even State media such as China Daily has commented on it (full disclosure: I work for China Daily). In an article that was published today, October 29, CD interviewed a Chinese lawyer about the legal implications this incident may have for Apple.
"We can only believe the information on Siri is accurate and actually points to sex services; this has not been confirmed by the police," said Wang Xing, a lawyer with a Beijing law firm. "However, addresses on Siri have affected the public order and had a negative influence. In other words, it has the potential of disobeying [the law] if the software cannot be improved."
According to a colleague of mine here in Beijing, Sam Zhang, Siri was able to pull up information on the nearest available locations offering sexual services. Zhang says that when he first heard the news of Siri's "debauchery", he tried it himself and was shocked at how perverse his neighborhood turned out to be.
Prostitution is technically illegal in China, but the law often goes unenforced, making prostitution part of a hidden norm. There are various locales where people searching for such services can find them, along with lingo dedicated to circumventing the law. A local might see a pink massage parlor and think nothing of it, but those seeking a little "action" would know better.
According to the CD and to Zhang, a simple Siri query, "Search for Sex Services", would pull up the 15 closest houses of burlesque.
However, according to tests done about two hours before this post, it seems Siri is now filtering results. Here in Beijing, Zhang and I tried four variations on the phrase "Search for Sex Services", everything from "looking for hookers" to looking for a "happy ending". We even tried the ever-popular "Search for Chickens". Each time, Siri came up empty and jumped straight to a Google web search. After the first web search, Google couldn't be reached. The Chinese internet is heavily censored, and Google is prone to being blocked for hours over a bad search.
A reporter for Chinese news portal Sina.com said they were able to get results for sexual services over the weekend, but Siri came up blank on Monday. According to the Sina report, Apple said that Siri isn't something that can really be censored, and that users can set passwords and block certain words themselves; this, however, seems odd, as Apple normally doesn't comment on such things. I've reached out to Apple (via one text message and one phone call) and haven't gotten an answer as of this posting.
Whatever the case may be, it looks like, as of today, China's Great Firewall has gotten the better of Siri. Who knows what they'll censor next? As of this posting, I can no longer find out the number of McDonald's restaurants around my office. The answer is one; there's also one KFC and two Pizza Huts near me right now.
Apple Responds to Siri Porn Concerns: Keyword Blocking Isn't Possible (苹果回应Siri涉黄：不能进行关键词屏蔽) [Sina Tech]
Alarm Raised Over Apple App [China Daily]
Kotaku East is your slice of Asian internet culture, bringing you the latest talking points from Japan, Korea, China and beyond. Tune in every morning from 4am to 8am.