Lawsuit Claims Siri Doesn’t Know What She’s Talking About

In a complaint filed on March 6 (available on Scribd via the WSJ's Law Blog), a New York man alleges that the virtual assistant Apple built into his iPhone 4S doesn't work as advertised. Frank Fazio is suing Apple in a California federal court over this, and not surprisingly hopes to represent a class of consumers who are, or who he imagines might be, unhappy with Siri too.

Fazio alleges that he bought an iPhone 4S in Brooklyn after having been "exposed to Apple's representations regarding the Siri feature. Plaintiff would not have paid the price he paid for the iPhone 4S, if he had not seen these representations." If it sounds odd that Fazio is claiming to have been "exposed" to misrepresentations, rather than saying he "relied" on them, then you may not be familiar with California's Unfair Competition Law. (Disclosure: I represent clients who have been sued under the UCL, although sadly for me and my firm, we do not represent Apple.)

Prior to 2004, there was no requirement that there be any connection at all between a UCL plaintiff and whatever evil he or she was claiming to attack. After some highly publicized bogus lawsuits, the law was amended to require a plaintiff to prove such a connection (i.e., actual reliance on a misrepresentation). But some California judges seem not to have noticed that, or at least they require so little of a plaintiff that the amendment might as well never have happened. Not all judges, by any means, but some. And so you still get complaints like this one, which allege almost no facts about the plaintiff.

Why did Fazio buy the 4S, exactly? Did he want to "make appointments," "find restaurants," or "learn the guitar chords to classic rock songs," as he says the commercials promise will be possible? Did he need Siri to show him "how to tie a tie"? Which of these were important to him personally, if any? He doesn't say. Nor does he say exactly how Siri failed him:

Promptly after the purchase of his iPhone 4S, Plaintiff realized that Siri was not performing as advertised. For instance, when Plaintiff asked Siri for directions to a certain place, or to locate a store, Siri either did not understand what Plaintiff was asking, or, after a very long wait time, responded with the wrong answer. … Upon information and belief, Plaintiff's problems with Siri are not unique ….

What happened to "Stairway to Heaven," or whatever?

Evidently Fazio claims he was lulled by the commercials into thinking that Siri would be able to do anything he asked her to, without fail, which is nonsense. Siri has worked pretty well for me, and that's based on actual experience, not "information and belief." (That's legalese for "I'm alleging this but am also letting you know I haven't actually looked into it yet.") How many chances did Siri get before Fazio "promptly" gave up on her? Just those two?

On information and belief, he detected the alleged problem "promptly after the purchase" because his lawyer had already told him about it.

Not saying every UCL case is bogus, and I know nothing about this one other than what's in the complaint. I do think the idea that Fazio (or anybody else, for that matter) could properly represent a class of people all supposedly disappointed by Siri in the same way is bogus. But the way things are in California right now, that will likely depend on which judge gets the case, because the law is pretty arbitrary.

You do have to wonder if Siri will suddenly become especially good at finding things in the Northern District of California (San Jose Division), especially things that federal judges might need. Don't do it, Siri. It'll just look bad.

(I posted a version of this last week elsewhere. Just so you know.)