Healthcare workers hit Amazon with proposed class action over failure to notify them of Alexa's covert recording


Healthcare workers across the U.S. have filed a proposed class action against Amazon.com, alleging the Seattle-based e-commerce giant failed to warn users that its Alexa smart speakers are always eavesdropping, potentially leading those providers to violate patient privacy law.

The suit, filed Wednesday in the U.S. District Court for the Western District of Washington, comes on behalf of a customer service employee, a substance abuse counselor, an individual in the psychiatric field and a healthcare company employee, who “communicate and work with HIPAA-protected information that Alexa may have captured without [their] intent.” These four individuals allege that Alexa’s practice of listening to and recording conversations, and Amazon’s failure to notify users of this practice, violates state and federal wiretapping and consumer protection laws. The plaintiffs’ attorneys claim to represent a class of tens of millions of customers who purchased Alexa devices within the past four years. Amazon did not immediately respond to an interview request.

“Not only does this covert recording, storing and analyzing of consumer information entail a profound violation of privacy, but given the increasingly all-encompassing scope of Alexa Devices’ ability to interface with aspects of a consumer’s life (e.g., controlling home security features such as locks and lights, and providing access to personal medical or identity-related information), Amazon’s storage of the recordings—unnecessary to the functionality of Alexa Devices—creates a risk from hacking or other unauthorized leveraging of consumer data and processes by third parties (or Amazon personnel),” the lawsuit says.

In 2020, Amazon sold more than 200 million Alexa devices, according to the suit. When these smart speakers hear “wake words,” they automatically begin recording audio and attempt to perform the requested action. These recordings are immediately uploaded to Amazon’s Alexa Cloud storage system, where company developers use the data to fine-tune AI products or sell it to third parties, the plaintiffs contend.

But Alexa’s technology is not perfect. The plaintiffs cite a 2020 Northeastern University study that found smart speakers, including Alexa, began recording up to 19 times per day even though no “wake word” was spoken. The complaint also cites a Science Focus magazine article published in March 2020, which found Alexa responded to 1,000 words that are not supposed to trigger the device, such as “election.”

Although the company is aware its smart speakers are eavesdropping on users, “Amazon has taken no remedial action to address the interception of the communications of plaintiffs and the putative class, but rather has sought to continue and expand it,” the suit says.

The plaintiffs want Amazon to stop recording their conversations without consent, and they want the company to reimburse their attorney fees, pre- and post-judgment interest, damages, and any other relief the court deems just and proper. The healthcare worker plaintiffs are seeking a jury trial.

The lawsuit comes as more health insurers seek to build integrations with Amazon.

In May, UPMC unveiled a feature that allows Amazon Alexa users to access basic plan information through their smart speakers, and the Pittsburgh-based insurer says no individual information is accessible to Amazon when policyholders use the tool. Blue Cross and Blue Shield of Michigan offers a similar service, but the not-for-profit insurer said it only provides basic plan data and nothing that would identify users.

Anthem, which operates Blue Cross and Blue Shield plans in 14 states, also has an Alexa tool. The Amazon device only listens in on active requests and has no way to identify who asked the question, the company said on its website.


Source: modernhealthcare.com
