From: Miriam Bastian, FSF <[email protected]>
Subject: Psychological care should grant you all the freedoms and protection you deserve
Date: January 8, 2025 7:13 AM
  Links have been removed from this email. Learn more in the FAQ.
*Please consider adding <[email protected]> to your address book to
ensure that our messages reach your inbox and not your spam folder.*

*Read and share online: <[link removed]>*

Dear Free Software Supporter,

**The rise of proprietary parenting software promising to improve
mental health is concerning. Psychological care should grant you all
the freedoms and protection you deserve!**

"[Imagine an AI parenting companion that's always in your corner –
ready to answer questions or talk about how you're feeling at any time
of the day (or night)][1]," posts one of the authors from the venture
capital firm Andreessen Horowitz, who writes about large language
models (LLM) and machine-learning companies, and seems to think this
is a bright vision. More accurately, this is a nightmare when the app
is proprietary and the company collects your data like Mamatech LTD
does with its app Soula.

[1]: [link removed]

Soula claims to be the "[first ethical AI in maternity][3]" and "to
[empower women to flourish as individuals and mothers][4]." While it's
debatable whether software is the right solution here (I personally
think we need more human support), I fully agree with the founder of
Soula that we need to support pregnant and postpartum parents much
more! It would therefore be fabulous to have truly ethical software
that helps parents find information and advice during difficult phases
of pregnancy and parenthood.

If Soula really does a job as important as it promises to do,
empowering "women to flourish as individuals and mothers," it is even
more essential that it grants its users software freedom, i.e. the
freedom to run, modify, copy, and share the software. Don't you agree
that all parents should be able to benefit from the help that Soula
promises? Shouldn't they be able to use it whenever, wherever, and
however they see fit and to share it with other caregivers who can
also be supported by it? And shouldn't researchers be able to study
what the machine learning application does, refine it, and adapt it
for similarly important use cases? The problem, besides the question
of how sensible it is to rely on a machine-learning application that,
contrary to the "AI" label, is not intelligent at all, is that the app
used to communicate with the LLM is [proprietary][5].

[Proprietary software][5] takes away your freedom to run, modify,
copy, and share the program and instead gives the developers and the
company that owns the software power over its users. In the case of
Soula, its [terms of use][6] prohibit the user from modifying or
reverse engineering the app. They state that "you may not copy, store,
modify, distribute, transmit, perform, reproduce, publish, license,
create derivative works from ... software or code obtained from the
App (Website) without our prior express written consent which may be
withheld for any or no reason." In addition, access to the app, and
therefore to the service, is revocable, and you may not share it. On
top of that, Mamatech LTD, the company that owns Soula, collects
sensitive data from parents in their most vulnerable moments. This is
unethical, no matter how ethical the founder's intentions may be!

[3]: [link removed]
[4]: [link removed]
[5]: [link removed]
[6]: [link removed]

## Data-hungry LLMs

With [more than 100k users][7], Soula is a popular choice for
[investment companies][8] looking for opportunities to make money.
The service is based on GPT-3, one of the LLMs described as "so
data-hungry and intransparent that we have even less control over what
information about us is collected, what it is used for, and how we
might correct or remove such personal information" by [Jennifer
King][9], the author of the white paper [*Rethinking privacy in the AI
era: Policy provocations for a data-centric world*][10]. For the same
reasons, the Italian data protection authority temporarily banned
ChatGPT in 2023 after scrutinizing its data practices. In March 2023,
a simple bug in ChatGPT's chat history feature even led to
[descriptions of conversations being disclosed to other users][11].

[7]: [link removed]
[8]: [link removed]
[9]: [link removed]
[10]: [link removed]
[11]: [link removed]

## Soula registers your menstrual cycle, sexual activities, and mental well-being

Soula itself explicitly processes personal data for commercial
interest according to its [privacy policy][12], which states that it
"analyze[s] users' behavior on our services to operate advertising and
marketing campaigns, to measure their effectiveness." And the amount
of data that this app collects is immense! It ranges from the users'
date of birth, menstrual cycle, sexual activities, and physical and
mental well-being to the messages they write to the chatbot. As if
collecting such sensitive data were not enough, Soula also surrenders
this sensitive information to other companies such as Amplitude, Inc.,
Firebase, and, of course, OpenAI.

[12]: [link removed]

For US residents, the use of proprietary software for psychological
care and the [tracking of your period or pregnancy][13] are especially
delicate because US authorities can subpoena your sensitive data from
Soula. When registering, you have to provide the city and the state of
your residence. If you live in a state that bans abortion after a
certain length of gestation and you decide to (or must, for medical
reasons) end your pregnancy, the information you revealed while using
Soula or a similar app can be used against you in criminal
court. Even if you have deleted the app, any data you have already
entered can be requested from Soula for legal proceedings.

[13]: [link removed]


## Proprietary mental health and parenting apps are on the rise

Soula is just one of several apps on the rise in the sector of
psychological care and parenting. Other examples of mental health apps
are Youper, BetterHelp, Woebot, Pray.com, and Talkspace, all of which
share the same problem of being nonfree software, and [in terms of
privacy infringements some are even worse][14] than Soula. Other
companies, like [Cradlewise and Nanit][15], see profit in catering to
parents. Their baby monitors tell you if your child is breathing and
even rock the crib for you. Think twice before you entrust these
companies with caring for your baby or with your intimate thoughts,
which aren't so private once disclosed to one of these corporations.

[14]: [link removed]
[15]: [link removed]

## Call to action

We strongly recommend not using Soula, Youper, BetterHelp, or any
other related proprietary apps for psychological care or as a [daily
journal][16]. Instead, we recommend using local, free apps (like a
[journaling program][17]) or good old paper journals, talking to your
friends, and seeking in-person help from psychologists and doctors.
Should we be so quick to rely on software made by a for-profit
corporation to improve mental health? Don't we rather need decent
psychological care and people around us who support us? If you live in
a country where it is difficult to get an appointment with a doctor or
psychologist, we recommend that you appeal to your government to
improve the health care system so that people don't need to give up
their freedom and privacy to companies like Mamatech LTD. Soula and
the like may seem convenient, but they are not worth losing your
freedom over. Remember: [you are worthy of all the privacy you want
and need!][18] And you are worthy of the [four freedoms][19]!

[16]: [link removed]
[17]: [link removed]
[18]: [link removed]
[19]: [link removed]


The FSF is currently running its [year-end fundraiser][20].
[Donate][18] to help us reach our [fundraising goal][21] of USD
400,000 so that we can conduct research and write articles like this
one to educate users about their rights and about the pitfalls of
proprietary software and other infringements on user freedom!

[20]: [link removed]
[21]: [link removed]


Yours in freedom,

Miriam Bastian
Program Manager
--
Interested in helping us expand our reach?

* Follow us on Mastodon at <[link removed]> and PeerTube at <[link removed]>, showing your support for federated social networks.
* Get active on the LibrePlanet wiki: <[link removed]>
* Share on your blog or [social network]([link removed]) that you support us, and why you do so.
* Subscribe to our RSS feeds: <[link removed]>
* Join us as an associate member: <[link removed]>; and display your membership button (<[link removed]>) on your website.

Read our Privacy Policy: <[link removed]>

Sent from the Free Software Foundation,

31 Milk Street
# 960789
Boston, Massachusetts 02196
United States


You can unsubscribe from this mailing list by visiting

[link removed].

To stop all email from the Free Software Foundation, including Defective by Design,
and the Free Software Supporter newsletter, visit

[link removed].