
CfA's October 20, 2023 Newsletter

With your support, Campaign for Accountability is working to expose corruption and hold the powerful accountable.

This Week's Updates: 

CfA Submits Comment on Campaign Deepfakes 
Thanks to open-source AI software, it’s easier than ever for individuals (or political campaigns) to create convincing deepfakes of candidates. Similar technology has already been used by the DeSantis campaign to create still images of former President Trump embracing Dr. Anthony Fauci, which were quickly identified as fake. While the Federal Election Commission agreed to at least accept public comments on deepfakes, multiple commissioners have expressed clear hesitancy, claiming that they do not have jurisdiction over this type of fraud. 
 
But they do indeed have jurisdiction, as Public Citizen argued in its petition for rulemaking. In a comment concurring with that petition, sent to the Commission on Monday, CfA Executive Director Michelle Kuppersmith urged the unconvinced commissioners to reconsider. Her comment outlines how the skeptical commissioners are failing to consider a deepfake’s ability to speak “on behalf” of other candidates, which falls squarely within the Commission’s jurisdiction to hold people accountable for “fraudulent misrepresentation.”
 
Taken together with a separate FEC rulemaking that exempted “promoted” online videos from “Paid for” disclosures, this means a campaign could hypothetically upload a deepfaked video of a candidate saying something damaging and then pay to “boost” it across social media platforms, all without adding a “paid for” disclaimer. If the FEC declines to act, this type of super-charged misinformation could become an acceptable practice in federal elections.
How Platform Changes Weakened X Ahead of Israel-Hamas Conflict 
Twitter’s old content moderation regime wasn’t perfect, but the events of the past two weeks have shown that certain safeguards existed for a reason, and that Elon Musk’s X is struggling without them. The shift to paid verification stripped blue checks of their role as a form of authentication, flooding the platform with accounts that could simply pay to have their replies prioritized by its algorithms while removing a marker once used to prevent impersonation and fraud. Around the same time, Musk granted “amnesty” to accounts that had been suspended for infractions like hate speech, targeted harassment, and calls to violence. Now, as the conflict between Hamas and Israel unfolds, these policies have enabled bad actors to impersonate journalists and use X Premium accounts to spread misinformation, which they may be incentivized to do through the platform’s revenue-sharing program. At the same time, layoffs to X’s trust and safety teams have left it with fewer staff to respond to these issues or test new features. For instance, a joint TTP and ADL review found that X was placing advertisements in searches for well-known neo-Nazi groups, while platforms like Meta’s Facebook largely blocked such searches. Viewed as a whole, these cumulative changes have reshaped the flow of information on X and tilted the scales in favor of bad actors who would have struggled to gain a foothold on the old Twitter. 
Lawmakers Turn Up Heat on Crypto-Financed Terrorism 
Last week, the Wall Street Journal reported that the Israeli government had tracked large cryptocurrency transactions to digital wallets associated with groups like Hamas, Palestinian Islamic Jihad, and Hezbollah – all of which are designated as foreign terrorist organizations by the U.S. government. Now, a bipartisan coalition of lawmakers led by Sens. Sherrod Brown (D-OH) and Elizabeth Warren (D-MA) is calling on the Biden administration to crack down on terrorist financing enabled by cryptocurrency exchanges. Some companies with ties to international crime have already been shut down, like the Hong Kong-registered Bitzlato, but others have endured; the Russia-based Garantex, for instance, simply moved its operations to an undisclosed location and continued to facilitate transactions on behalf of terrorist organizations and cybercriminals. A day after the lawmakers sent their letter, the Treasury Department’s Financial Crimes Enforcement Network (FinCEN) signaled that it would be tightening restrictions on a practice known as Convertible Virtual Currency mixing (CVC mixing), which allows illicit actors to obfuscate their cryptocurrency transactions. According to the notice of proposed rulemaking, FinCEN has identified CVC mixing as a “primary money laundering concern,” meaning that additional recordkeeping and reporting requirements will be imposed on these transactions. 
What We're Reading
FTC urged to investigate anonymous messaging app 
Transparency legislation stalls in Michigan … again
Leonard Leo-tied group registers to lobby

Follow Our Work:


We thank you for your continued support. Without people like you, our work would not be possible.

Here is how you can stay involved and help us accomplish our mission:
  1. Follow CfA on Twitter.
  2. Follow the Tech Transparency Project on Twitter.
  3. Tell your friends and colleagues about CfA. 
  4. Send us a tip.
  5. Make a tax-deductible donation.
Be on the lookout for more updates about our work in the upcoming weeks. Thanks again for signing up to be a part of CfA!  
 
Sincerely, 

Michelle Kuppersmith
Executive Director, Campaign for Accountability
Twitter
Website
Copyright © 2023 Campaign for Accountability, All rights reserved.
You signed up for this list at campaignforaccountability.org

Our mailing address is:
Campaign for Accountability
611 Pennsylvania Ave SE
#337
Washington, District of Columbia 20003



