An AI-generated video of Arizona politician Kari Lake brings home the risks that such deepfakes pose to elections.
There’s something scary online involving Kari Lake — and it’s not what you might expect.
The nonprofit journalism site Arizona Agenda posted a minute-long video of the TV news anchor turned GOP candidate praising the site’s work . . . and then, halfway through, revealing that it is all a deepfake. Watch it here.
Watch it especially on a phone, where the glitches are less noticeable. This is new, and unnerving, and ominous.
It has been less than two years since ChatGPT was released and the world began to debate how much change advances in generative artificial intelligence tools would bring. Are they like Gutenberg’s Bible, made possible by the new technology of the printing press? Or are they yet another techno-fad, more hype than impact? Over the coming years, all this will unfold, with massive repercussions for our work, health care, and lives. (A guarantee: The Briefing is written by a live person and always will be!)
When it comes to elections, it is increasingly clear that the biggest new threat in 2024 is generative AI’s impact on the information ecosystem, including deepfakes like the one starring “Kari Lake.” (The real Lake, meanwhile, sent a cease-and-desist letter to the website.) The risk is especially high with audio, which can be easier to manipulate than visual imagery — and harder to detect as fake.
Last year, the Slovak parliamentary election may have been tipped by fake audio of a leading candidate that went viral days before the vote. In New Hampshire, bogus robocalls from “Joe Biden” urged voters to sit out the primary. In Chicago’s mayoral election, a fake tape purported to feature a candidate musing, “In my day, no one would bat an eye” if a police officer killed 17 or 18 people. The risk of doctored audio and video makes it harder to know what is real. Donald Trump has taken to decrying any video that makes him look bad as fake.
At the Brennan Center, we worry especially about how all this might affect the nuts and bolts of election administration. Recently we held a “tabletop exercise” with Arizona Secretary of State Adrian Fontes, one of the country’s most effective public servants, and other election officials in the state. It featured a similar fake video starring Fontes, created for educational and training purposes. The verisimilitude was so unnerving that the recording was quickly locked away.
Here’s a scenario we tested: You’re a local election official. It’s a hectic Election Day, and you get a call from the secretary of state. “There’s been a court order,” she says urgently. “You need to shut down an hour early.” When you receive a call like that, you should take a breath and call the secretary of state’s office back. You’ll quickly find out that the call was a deepfake. That’s the kind of simple process that can catch the fraud before it takes root.
Government can take other steps, too. We’ve laid out many of them in a series of essays with Georgetown’s Center for Security and Emerging Technology. Often, the necessary measures are ones that already make sense as protection against cyberthreats and other challenges.
There is more that needs to be done. One good step is to watermark and label AI-generated content, making clear when AI was used to create or alter an image. Meta (aka Facebook) proudly unveiled such a system to label all content created with AI tools from major vendors such as OpenAI, Google, and Adobe. My colleague Larry Norden, working with a software expert, showed how easy it is to remove the watermarks from these images and circumvent Meta’s labeling scheme. It took less than 30 seconds.
So government will need to step up. Sen. Amy Klobuchar (D) of Minnesota, a leader on election security, is working with Sen. Lisa Murkowski (R) of Alaska and others to craft bills requiring campaign ads that make substantial use of generative AI to be labeled. That requires finesse, since courts will be wary of First Amendment issues. But it can be done. Such reform can’t happen fast enough.
After all, as the deepfake Kari Lake put it so well, “By the time the November election rolls around, you’ll hardly be able to tell the difference between reality and artificial intelligence.” That’s . . . intelligent.
Analyzing Minor Crimes in NYC
Misdemeanors, not felonies, dominate criminal justice. But incomplete national data makes it difficult to grasp the real scope of the minor offense system. A Brennan Center report homes in on New York City, drawing on court case data from 2016 to 2022 and discussions with stakeholders to understand trends in misdemeanor enforcement, the harms of the system, and better approaches to long-term public safety. “Addressing these complex problems requires looking beyond a simple binary of aggressive enforcement versus inaction,” Josephine Hahn, Ram Subramanian, and Tiffany Sanabia write. Read more
Oversight for DHS’s Use of AI
The Department of Homeland Security plans to incorporate generative artificial intelligence models into a wide range of activities, from combating human trafficking to training immigration officials. But DHS has a long history of violating Americans’ privacy and civil rights. “AI tools risk supercharging practices that trample on the rights of millions of Americans, and DHS lacks oversight and accountability mechanisms strong enough to stop such harm,” Spencer Reynolds writes. Read more
Trump’s Potential Violations of the Georgia Constitution
Last week, a Georgia judge dismissed six charges against Donald Trump and his codefendants in their prosecution for election interference, while leaving the rest of the case intact. The court found that the indictment lacked specificity regarding the allegations that the defendants solicited state officials to violate their oaths of office. Should the prosecutors seek to bring the charges again, they should rely on the Georgia Constitution because it “provides ample bases for establishing more specific charges,” Georgia State University law professor Anthony Michael Kreis writes for State Court Report. Read more
Prosecution of Voter Fraud in Florida
Despite an amendment to the Florida Constitution restoring voting rights to most people with felony convictions, subsequent state legislation has caused mass confusion over who is now allowed to vote. And Gov. Ron DeSantis tapped the Office of Statewide Prosecution instead of local prosecutors to bring voter fraud charges against people who mistakenly believed their rights were restored and voted. A Florida appeals court is now considering whether the statewide prosecutors have authority to bring such cases. They do not, law professors Robert F. Williams and Quinn Yeargain argue in State Court Report. Read more
Coming Up
VIRTUAL EVENT: Misdemeanors by the Numbers
Thursday, April 11, 3–4 p.m. ET
A decade of reforms has shrunk the sprawling misdemeanor system, but the prosecution of shoplifting, traffic violations, and other lesser offenses remains a burden on vulnerable communities and law enforcement resources, even as public concern over physical and social disorder in public spaces spurs calls for renewed enforcement. A new Brennan Center report zooms in on New York City as a case study for how misdemeanor enforcement has changed in recent years, offering insights into the impact of the Covid-19 pandemic and reform initiatives. Join report author Josephine Hahn, the MacArthur Foundation’s Bria L. Gillum, and Michigan county sheriff Jerry Clayton for a virtual discussion about this under-examined part of our criminal justice system. RSVP today
Produced with support from the John D. and Catherine T. MacArthur Foundation
VIRTUAL EVENT: The Failed Experiment of Mass Incarceration
Thursday, April 17, 3–4 p.m. ET
The United States has the highest incarceration rate in the world, a dubious distinction with grave social consequences. Excessive Punishment: How the Justice System Creates Mass Incarceration, a new book edited by the Brennan Center’s Lauren-Brooke Eisen, explores the roots and social costs of mass incarceration, as well as reforms that would prioritize human dignity and restoration over retribution. Join us virtually for a live event moderated by Eisen to hear from several of the book’s contributors on why the U.S. criminal justice system is so punitive and what alternatives could rebalance it. RSVP today
Want to keep up with Brennan Center Live events? Subscribe to the events newsletter.
News
Sean Morales-Doyle on who confers the power to vote // USA TODAY
Faiza Patel on AI use by the government // AXIOS
Robyn Sanders on protecting elections from gun violence // Stateline
Dan Weiner on the role of money in the 2024 elections // NPR’s 1A
Feedback on this newsletter? Email us at [email protected]
Brennan Center for Justice at NYU School of Law
120 Broadway, Suite 1750 New York, NY 10271
646-292-8310
[email protected]
Support Brennan Center
Want to change how you receive these emails or unsubscribe? Click here to update your preferences.