August 11, 2025
Young People Are Using ChatGPT for Therapy, but the Conversations Lack Privacy and Protections
By Nic Wong
Therapy is now one of the leading uses of AI. But unlike traditional therapy notes, which are privileged, transcripts with chatbots carry no such shield.

On May 13, Magistrate Judge Ona T. Wang issued an order requiring OpenAI to override its privacy policies and retain all ChatGPT user queries. In the copyright infringement case The New York Times Company et al. v. Microsoft Corporation et al., the court found that the chat records must be preserved as potential evidence in legal actions relating to the use of the service. OpenAI appealed the decision, writing in a statement that it “fundamentally conflicts with the privacy commitments” made to users. But for now, the order stands. The scale of the decision is enormous: OpenAI CEO Sam Altman estimates that ChatGPT fields roughly 300 million new queries each week.
One group of users is especially vulnerable to exposure under this new ruling: survivors of sexual assault and other traumas who turn to AI in the absence of other support networks, expecting that their identities and experiences won’t be revealed unless and until they are prepared to come forward. Now, every message, whether archived, drafted, or sent in temporary chat mode, will be preserved indefinitely.