Michael Keller ran a bunch of misspelled words through an iPhone simulator to see which topics Apple doesn’t autocorrect:
Our analysis found over 14,000 words that are recognized as words when spelled accurately, but that won’t be corrected even when they are only slightly misspelled. However, the vast majority of these words are technical or very rarely used words: “nephrotoxin,” “sempstress,” “sheepshank,” or “Aesopian,” to name a few.
But among this list as well are more frequently used (and sensitive) words such as “abortion,” “abort,” “rape,” “bullet,” “ammo,” “drunken,” “drunkard,” “abduct,” “arouse,” “Aryan,” “murder,” and “virginity.”
We often look to technology to make our lives easier—to suggest restaurants, say, or to improve things as simple as our typing. But as more and more of our speech passes through mobile devices, how often is software coming between us and the words we want to use? Or rather, when does our software quietly choose not to help us? And who draws the line?
A commenter makes an obvious counterpoint:
I’m not the engineer in charge of this, but if I *WERE*, I’d probably have been told that my boss didn’t want to see users complain about spellchecks turning innocent typos into offensive words. The best way to do that would be to mark the word as something that the dictionary wouldn’t complain about, but that would never feature in those “Damn You, Spellcheck!” websites or tweets.
If that’s the case in reality (and it certainly seems likely enough), it’s not censorship and it’s not squeamishness. It’s Apple keeping the majority of users’ little mistakes from escalating into major embarrassments.
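The commenter’s hypothesis boils down to a dictionary with a “valid, but never suggest” flag: the word passes the spellcheck, yet is withheld from the correction candidates. A minimal sketch of that idea, with the word list, function names, and one-substitution matching all invented for illustration (Apple’s actual implementation is unknown):

```python
# Toy dictionary. NEVER_SUGGEST words are accepted as correctly spelled
# but are never offered as autocorrect candidates.
DICTIONARY = {"ballot", "bullet", "mallet"}
NEVER_SUGGEST = {"bullet"}

def one_substitution_apart(a: str, b: str) -> bool:
    """True if a and b are the same length and differ in exactly one letter."""
    if len(a) != len(b):
        return False
    return sum(x != y for x, y in zip(a, b)) == 1

def check(word: str):
    """Return (is_valid, suggestions) for a typed word."""
    if word in DICTIONARY:
        return True, []  # recognized as a word, nothing to correct
    candidates = [w for w in sorted(DICTIONARY - NEVER_SUGGEST)
                  if one_substitution_apart(word, w)]
    return False, candidates

# "bullwt" is one letter off from "bullet", but "bullet" is withheld,
# so the typo is flagged yet no correction is offered.
```

Under this scheme the typo is still marked as misspelled; the software just declines to volunteer the sensitive word, which matches Keller’s observation that these words “won’t be corrected even when they are only slightly misspelled.”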