Recently, Amazon Ring announced it will soon add a troubling new feature to its line of home surveillance cameras and doorbells: facial recognition. This facial recognition tool threatens the privacy rights of millions of people and could put Amazon in violation of state biometric privacy laws.
The feature, called “Familiar Faces,” aims to identify specific people who come into view of the camera. When turned on, it will scan the face of every person who approaches the camera and try to match it against a list of pre-saved faces. Many of those people will not have consented to a face scan: friends and family, political canvassers, postal workers, delivery drivers, children selling cookies, perhaps even passersby on the sidewalk.
Any collection of biometric information by Ring in states that require opt-in consent poses a huge legal risk for the company. Amazon has already told reporters that the feature will not be available in Illinois or Texas, strongly suggesting it could not survive legal scrutiny there. The company said it is also avoiding Portland, Oregon, whose biometric privacy law has led similar companies to steer clear of the city.
As we write on our blog, biometric data such as your faceprint is among the most sensitive information a company can collect, carrying risks of mass surveillance, data breaches, and discrimination. Today’s feature that recognizes your friend at your front door can easily be repurposed tomorrow for mass surveillance. Ring’s close partnership with police amplifies that threat.
But Illinois and Texas aren't the only states protecting this kind of data. And many biometric privacy laws across the country are clear: Companies need your affirmative consent before running face recognition on you. In at least one state, ordinary people, with the help of attorneys, can challenge Amazon’s data collection in court. Where that isn't possible, we explain, state privacy regulators should step in.
READ MORE…
📱APP GATEKEEPING: In the name of "safety," Google recently announced plans to make all Android app developers verify their identities. But why does Google need to see someone's driver's license to evaluate whether an app is safe? On our blog, we explain why this kind of gatekeeping really means more centralized control by both corporations and governments—entities that don't always have our best interests at heart.
🚔 BIASED POLICING: A new EFF investigation has found that more than 80 law enforcement agencies across the United States have used language perpetuating harmful stereotypes against Romani people when searching the nationwide Flock Safety automated license plate reader (ALPR) network. This data comes from audit logs obtained and analyzed by EFF, providing more evidence that audit logs and internal policies alone cannot prevent a surveillance system from becoming a tool for racist policing.
🫡 DEPARTMENT OF DEFERENCE: If you or I were spending millions of dollars on new technology, we would probably want some assurance it does what it's supposed to do. But, as we write on our blog, the U.S. Department of Defense seems to want less proof its software works. This year's defense spending bill includes changes that would reduce cost disclosures and testing requirements for military acquisitions of technology like AI, signaling one thing: speed over due diligence.
🎙 GATE CRASHING: Check out Gate Crashing, EFF's video series where we talk to people who have used the internet to take nontraditional paths to the very traditional worlds of journalism, creativity, and criticism. In our latest episode, "From DIY to Publishing," we're talking to Preeti Chhibber, a writer who comes from a place that many of us do: being a fan.