Is Your Tech Listening? Apple Settles Claim for Siri Eavesdropping

The lawsuit, Lopez v. Apple, dates back to July 2019, when the Guardian published the allegations of an anonymous whistleblower—an Apple subcontractor whose job was to listen to Siri recordings to determine whether the voice-activated assistant was being correctly triggered. The whistleblower claimed that accidental Siri activations routinely captured sensitive audio. Despite Apple’s promises that Siri listens only when invited, background noises (often just the sound of a zipper, according to the whistleblower) could switch it on. The contractor also said that user location and contact information accompanied the recordings.

Apple had never explicitly told users that humans might review their Siri requests, and within a week of the Guardian report, the company halted the program. The first Lopez v. Apple complaint was filed in August 2019, and two weeks later Apple issued a public apology in which it promised to make human review opt-in only and to stop retaining audio by default. That apology was framed to allay customer concerns—not as an admission of wrongdoing. Apple denied all allegations in the lawsuit, as is common in class-action settlements in U.S. courts.

If the situation sounds familiar, your memory works. In 2018 Amazon’s Alexa recorded a married couple’s conversation about hardwood floors and sent it to one of the husband’s employees. Amazon blamed an unlikely chain of misheard cues—basically, it came down to Alexa butt-dialing someone with living room chatter. The following year Bloomberg reported that Amazon had thousands of workers transcribing clips to fine-tune the assistant. Later Google faced similar allegations. The pattern was clear: robots needed to be trained to make sure that they were hearing voice commands correctly, and this training needed to come from humans who, in the process, inevitably heard things they shouldn’t via consumer gadgets. Even TVs were implicated: in 2015 Samsung warned owners not to discuss secrets near its smart sets because voice commands were sent to unnamed third parties, a disclaimer that could have been written by George Orwell.

This isn’t tin-foil-hat territory. A 2019 survey found that 55 percent of Americans believe their phones listen to them to collect data for targeted ads, and a 2023 poll pushed the number north of 60 percent. In the U.K., a 2021 poll found that two thirds of adults had noticed an ad that they felt was tied to a recent real-life chat. But psychologists say this perception of “conversation-related ad creep” often relies on a feedback loop driven by confirmation bias: we ignore the thousands of ads that form a constant backdrop to our lives but build a campfire legend from the one time we mentioned “fire” and an app tried to sell us tiki torches. The result is a low-grade cultural fear, with people placing masking tape on device mics and TikTokers begging Siri to stop stalking them. Knowing how ravenous tech companies are for data, people can hardly be blamed for this attitude.

As for Apple, which once put “What happens on your iPhone, stays on your iPhone” on a Las Vegas billboard, the settlement doesn’t force it to admit fault—but it puts a dent in the company’s titanium halo: If the Cupertino, Calif.–based company can’t keep a lid on hot-mic moments, who can?

(Asked for comment by Scientific American, Apple shared information on the settlement and emphasized its commitment to privacy. And Amazon reiterated its commitment to privacy, writing, “Access to internal services is highly controlled, and is only granted to a limited number of employees who require these services to train and improve the service.” Samsung and Google had not responded to requests for comment by the time of publication.)

Deni Ellis Béchard is Scientific American’s senior tech reporter. He is the author of 10 books and has received a Commonwealth Writers’ Prize, a Midwest Book Award and a Nautilus Book Award for investigative journalism. He holds two master’s degrees in literature, as well as a master’s degree in biology from Harvard University. His most recent novel, We Are Dreams in the Eternal Machine, explores the ways that artificial intelligence could transform humanity. You can follow him on X, Instagram and Bluesky @denibechard.