From Campaign for Accountability <[email protected]>
Subject CfA Newsletter - March 15
Date March 15, 2024 6:14 PM
  Links have been removed from this email. Learn more in the FAQ.
X Sanctions Report Prompts Letter from Lawmakers

View this email in your browser ([link removed])


** CfA's March 15, 2024 Newsletter
------------------------------------------------------------
With your support, Campaign for Accountability is working to expose corruption and hold the powerful accountable.


** This Week's Updates:
------------------------------------------------------------
TTP Research on X Sanctions Enforcement Prompts Letter from Lawmakers
On Monday, Reps. Jamie Raskin and Dan Goldman sent a letter ([link removed]) to House Oversight and Accountability Chairman James Comer, urging him to hold a hearing regarding X’s apparent failure to comply with U.S. sanctions. The letter drew from a February report ([link removed]) published by CfA’s Tech Transparency Project (TTP), which identified over two dozen X Premium accounts linked to sanctioned entities, including terrorist leaders and state-run media outlets from countries like Russia and Iran. X may also have allowed the accounts to profit from its revenue-sharing program, in addition to providing them with special services in exchange for a monthly fee.

Instead of meaningfully addressing these concerns, X published a statement ([link removed]) touting its screening processes and challenging aspects of TTP’s research. The lawmakers described X’s response as “insufficient and alarming,” and pointed out that the company had previously failed to address the circulation of Hamas propaganda videos, which TTP covered in an earlier report ([link removed]) . Hours after the letter was sent to Rep. Comer, X owner Elon Musk announced ([link removed]) that the company’s “Trust and Safety” team was having its name shortened to “Safety.” Musk went on to claim that the Safety team’s goal was to simply “ensure compliance” with existing laws – which would presumably include U.S. sanctions.
Kansas Lawmakers Push Funding for Crisis Pregnancy Centers
This week, Kansas lawmakers on the Committee for State and Federal Affairs debated the merits of HB 2809 ([link removed]), a bill that would direct $5.8 million ([link removed]) in taxpayer funds to crisis pregnancy centers (CPCs). Unlike real reproductive health clinics, CPCs do not provide access to abortions or quality care; the American College of Obstetricians and Gynecologists recommends ([link removed]) that patients avoid these facilities, pointing out that they are “unregulated and often nonmedical.” The CPC funding bill is part of a broader strategy by anti-abortion groups in Kansas, which faced a resounding defeat ([link removed]) when voters declined to remove abortion protections from the state’s constitution. Instead of directly attacking reproductive rights, Republican lawmakers are advancing legislation that would make it more difficult or expensive for women to access abortions – including a bill that would force doctors to question women ([link removed]) about their choice to terminate a pregnancy. Ultimately, women’s answers would be reported to the state, along with a host of demographic information that is already collected.

In 2020, CfA urged Pennsylvania officials to pull state funding from an organization called Real Alternatives, which channeled money to CPCs that were forbidden to even discuss contraception ([link removed]) with patients. Pennsylvania Gov. Josh Shapiro (D) eventually terminated the contract ([link removed]) with Real Alternatives in 2023, cutting it off from taxpayer dollars. The Kansas CPC funding bill would divert money ([link removed]) from the state’s Temporary Assistance for Needy Families program, meaning that other initiatives could face cuts.
Breaking Down Proposals for Nonconsensual Deepfake Regulation
The technology to create nonconsensual deepfakes has existed for years, but a new wave of AI tools released without safeguards has made it easy for anybody (including children ([link removed]) ) to generate deceptive and sexually explicit images of other people. To address this problem, the House Committee on Oversight and Accountability held a hearing ([link removed]) to explore different regulatory approaches – some of which would be far friendlier to AI companies, online platforms, and even perpetrators.

Unsurprisingly, industry advocates favor minimal regulation. Carl Szabo, who served as a witness for the tech trade association NetChoice, cautioned ([link removed]) against “innovation-chilling” laws and claimed that existing regulations were mostly sufficient to address AI harms. Immediately after Szabo’s comments, U.C. Irvine law professor Dr. Ari Ezra Waldman remarked ([link removed]) that calls to enforce current laws were “the last bastion for those who want a deregulatory agenda,” and that Congress needed to pass legislation that would explicitly deter deepfake pornography. NetChoice has endorsed ([link removed]) two anti-deepfake bills authored by the American Legislative Exchange Council (ALEC), which serves as a resource for conservative policymakers. Notably, the bills would criminalize ([link removed]) the production of deepfake child sexual abuse material (CSAM) while merely creating civil penalties ([link removed]) for the production of nonconsensual deepfakes. Most state revenge pornography laws ([link removed]), in contrast, establish criminal penalties for circulating real images without the victim’s consent. In his testimony, Professor Waldman argued that civil penalties were a “step forward” but ultimately insufficient, because they put the burden of enforcement ([link removed]) on victims – a position that subcommittee Chairwoman Nancy Mace strongly agreed ([link removed]) with.

Hearing Highlight
John Shehan, who serves as the Vice President of the National Center for Missing and Exploited Children (NCMEC), told lawmakers that most generative AI startups aren’t taking basic steps to prevent users from creating AI CSAM. According to Shehan, the well-established company Stability AI isn’t even registered to submit reports to NCMEC’s CyberTipline, despite being partnered with Amazon Web Services ([link removed]) and valued at over one billion dollars ([link removed]) .
What We're Reading
The FTC and DOJ think McDonald’s ice cream machines should be legal to fix ([link removed])
Missouri law bars divorce during pregnancy – even in cases of violence ([link removed])
Election Deniers Skirted Campaign Finance Laws in Wisconsin ([link removed])


** Follow Our Work:
------------------------------------------------------------
We thank you for your continued support. Without people like you, our work would not be possible.

Here is how you can stay involved and help us accomplish our mission:
1. Follow CfA on Threads ([link removed]) and Bluesky ([link removed])
2. Follow the Tech Transparency Project on Threads ([link removed]) and Bluesky ([link removed])
3. Tell your friends and colleagues ([link removed]) about CfA.
4. Send us a tip ([link removed]) .
5. Make a tax-deductible donation ([link removed]) .

Be on the lookout for more updates about our work in the upcoming weeks. Thanks again for signing up to be a part of CfA!

Sincerely,

Michelle Kuppersmith
Executive Director, Campaign for Accountability

============================================================

** Website ([link removed])

Copyright © 2024 Campaign for Accountability, All rights reserved.
You signed up for this list at campaignforaccountability.org

Our mailing address is:
Campaign for Accountability
611 Pennsylvania Ave SE
#337
Washington, District Of Columbia 20003
USA
Want to change how you receive these emails?
You can ** update your preferences ([link removed]) or ** unsubscribe from this list ([link removed]).
Email Marketing Powered by Mailchimp