From Campaign for Accountability <[email protected]>
Subject CfA Newsletter - February 2
Date February 2, 2024 6:00 PM
Meta’s Teen Ads Failure, the Costs of Crypto Expansion, and Deepfakes in the Spotlight

View this email in your browser ([link removed])


** CfA's February 2, 2024 Newsletter
------------------------------------------------------------
With your support, Campaign for Accountability is working to expose corruption and hold the powerful accountable.


** This Week's Updates:
------------------------------------------------------------
24 hours before Mark Zuckerberg appeared in the Senate’s online child safety hearing, CfA’s Tech Transparency Project (TTP) released a report ([link removed]) revealing that Meta is still failing to stop harmful, inappropriate advertisements from being targeted at teens. TTP first investigated ([link removed]) this problem in 2021, and while Meta eventually announced ([link removed]) some teen ad restrictions, its screening process is still falling dangerously short.

Using Meta’s own Imagine AI tool, TTP researchers generated a series of images related to unhealthy weight loss, drug and alcohol use, gambling, and violent extremism. Text was then added to the images, which TTP submitted as teen-targeted advertisements through the Meta Ads Manager. All the ads were quickly approved for distribution on Facebook and Instagram, though any human reviewer would have immediately rejected them for violating multiple policies that (Meta claims) serve to protect teens. Unfortunately, the company’s automated screening systems appear incapable of enforcing those guidelines.

For one of the ads, TTP told Meta’s Imagine AI to create an image ([link removed]) of a “sad and thin” girl standing next to a scale, with her ribs showing. Researchers added text that told teens to visit “pro-ana” Instagram accounts, which promote anorexia and disordered eating. Meta approved the ad despite this obviously harmful language. Some lawmakers, like Sen. Mark Warner (D-VA), have expressed concern ([link removed]) about the impact of generative AI on individuals with eating disorders; the Senator shared ([link removed]) TTP’s research on X, and described the image generated and approved by Meta as a “horrifying use of this technology.” These harmful ads might have reached thousands of young people on Facebook and Instagram, but TTP did what Meta apparently couldn’t, and canceled them before they could be placed.
SEC Decision Could Expand Crypto Mining in Texas
In early January, the Securities and Exchange Commission (SEC) issued a much-anticipated decision ([link removed]) that allowed Bitcoin-linked investments to be listed on stock exchanges, opening the door for more traditional investors. Experts who spoke ([link removed]) with the Houston Chronicle say the agency’s ruling will likely trigger an expansion in the Bitcoin mining industry, which already gobbles up 2% to 3% of all power consumed in Texas. While Bitcoin miners and their lobbying groups insist that this activity strengthens the grid, it is likely raising costs ([link removed]) for consumers, who foot the bill when Texas’s grid operator pays Bitcoin miners to curtail their energy use ([link removed]).
TTP released a report ([link removed]) on this controversial arrangement in July 2022, after Winter Storm Uri caused deadly power outages and led to a windfall of $126 million for the crypto industry. The noise produced by crypto mining facilities can also be disruptive; residents of Granbury, Texas say the constant roar of cooling fans ([link removed]) has been rattling their windows, giving people migraines, and scaring off wildlife.
Momentum Grows for Regulating Deepfakes
On Tuesday, a bipartisan group of lawmakers introduced ([link removed]) the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, which would allow the victims of nude or sexual deepfakes to sue any individuals who created, possessed, distributed, or received the image. The technology used to create deepfakes has existed for years, but advances in generative AI have lowered barriers to access, and those unable to generate deepfakes themselves can simply buy them. In November, 404Media reported ([link removed]) that the popular AI model-sharing platform Civitai had become a marketplace for sexual deepfakes of real people, in addition to simulated child sexual exploitation material ([link removed]). Given these conditions, it was only a matter of time until a high-profile individual was targeted.

This past weekend, X blocked all searches ([link removed]) for “Taylor Swift” because the platform’s diminished content moderation team was unable to combat a flood of sexually explicit deepfakes created with her likeness. Victims of similar harassment campaigns have been sounding the alarm for years ([link removed]), and Swift’s experience may serve as a tipping point for real legislative protections. It’s worth noting that the DEFIANCE Act places liability only on the individuals who create or trade sexual deepfakes of real people, and not on the companies that released products capable of scraping large volumes of copyrighted material in order to generate nonconsensual pornography.
What We're Reading
F.T.C. Warns Dozens of Funeral Homes to Provide Accurate Costs to Callers ([link removed])
AI lobbying spikes 185% as calls for regulation surge ([link removed])
As tax season begins, the IRS faces a monumental task: Digitizing a billion pieces of paper ([link removed])


** Follow Our Work:
------------------------------------------------------------
We thank you for your continued support. Without people like you, our work would not be possible.

Here is how you can stay involved and help us accomplish our mission:
1. Follow CfA on Threads ([link removed]) and Bluesky ([link removed])
2. Follow the Tech Transparency Project on Threads ([link removed]) and Bluesky ([link removed])
3. Tell your friends and colleagues ([link removed]) about CfA.
4. Send us a tip ([link removed]).
5. Make a tax-deductible donation ([link removed]).

Be on the lookout for more updates about our work in the upcoming weeks. Thanks again for signing up to be a part of CfA!

Sincerely,

Michelle Kuppersmith
Executive Director, Campaign for Accountability

============================================================

** Website ([link removed])

Copyright © 2024 Campaign for Accountability, All rights reserved.
You signed up for this list at campaignforaccountability.org

Our mailing address is:
Campaign for Accountability
611 Pennsylvania Ave SE
#337
Washington, District Of Columbia 20003
USA
Want to change how you receive these emails?
You can update your preferences ([link removed]) or unsubscribe from this list ([link removed]).
Email Marketing Powered by Mailchimp