Want $20 for every Apple device that may have been listening in on you? The Cupertino-based tech giant is willing to spend up to $95 million to settle claims that Siri was a little too eager to eavesdrop on private conversations.
Apple has not admitted wrongdoing but agreed to compensate U.S. users up to $100 per household, according to court documents filed on December 13, 2024, in Oakland, California.
The settlement addresses allegations that the voice assistant recorded conversations without being triggered by the wake word “Hey Siri,” and may have stored that data and shared it with advertisers – something Apple has repeatedly denied.
Here’s a quick guide to claiming your share of the $95 million settlement.
How to get paid
To qualify for the settlement, you must be a U.S. resident who owned one or more eligible devices between September 17, 2014, and December 31, 2024. You must submit a claim by May 15, 2025, and swear under oath that Siri activated without your permission during a private conversation.
The website where claims can be submitted is not yet live; it is expected to launch within 45 days. Keep an eye on the news and check the official settlement page once it goes active so you can file your claim.
The settlement covers a wide range of Apple devices, including iPhone 6 and newer models, iPads released since 2014, all generations of the Apple Watch, the HomePod and HomePod Mini, as well as MacBooks and iMacs produced since 2014.
Under the settlement terms, users can receive $20 per eligible device, with a maximum payout of $100 per household for up to five devices. The final payment may increase if fewer claims are filed than expected. The legal team representing the plaintiffs will receive approximately $30 million from the settlement fund.
The claims process will begin with the launch of the official settlement website, expected in February 2025. Users should collect their device serial numbers or proof of purchase in advance. Once the site launches, claimants can complete the online form, provide any requested documentation, select a preferred payment method, and file their claim before the May 15 deadline.
Hey Siri, stop listening
The lawsuit stems from a 2019 exposé by The Guardian, which revealed that Apple contractors regularly listened to private Siri recordings. According to the claims, contractors heard medical appointments, business deals, and intimate moments – recordings that were also allegedly shared with advertisers.
Lead plaintiff Fumiko Lopez’s experience highlights the potential privacy violation. As reported by the BBC, shortly after discussing Air Jordan sneakers at home, she and her daughter saw targeted ads for the exact models they had mentioned. Another claimant reported seeing advertisements for specific medical treatments shortly after discussing them with his doctor.
“Apple has at all times denied and continues to deny any alleged wrongdoing or liability,” the settlement agreement states. The company maintains that Siri data is collected only to improve the service and remains anonymized.
In addition to the $95 million payment, the settlement also requires Apple to confirm the permanent deletion of all Siri audio recordings collected before October 2019.
This settlement comes amid growing concerns about AI-powered voice assistants and AI more broadly. Similar lawsuits have been filed against other tech giants, with Google facing a parallel class action in California.
“Plaintiffs in the lawsuit allege that Google Assistant can activate and record communications even if a user does not intentionally activate Google Assistant with a hot word, such as ‘Ok Google,’ or manually activate Google Assistant on their device,” the official site for that class action lawsuit reads.
Amazon agreed in 2023 to pay $25 million over similar privacy violations related to its Alexa devices, with the FTC noting that its “complaint alleges that Amazon retained children’s voice recordings indefinitely by default,” in violation of children’s privacy law.
Of course, all of these companies have long claimed to respect and protect their users’ privacy. That claim matters more than ever now that each of them is training its own generative AI models to improve the user experience – an effort that requires vast amounts of data.
If you want to be extra careful about your privacy, you can prevent Siri from activating automatically (on an iPhone, for example, by turning off wake-word listening in the Siri settings), or stop using AI assistants altogether. Not ideal, but that’s the world we live in.
Edited by Andrew Hayward