Late Thursday, OpenAI confronted user panic over a sweeping court order requiring it to retain chat logs en masse, including users' deleted chats, after moving to appeal an order that it says threatens the privacy of hundreds of millions of ChatGPT users globally.
In a statement, OpenAI chief operating officer Brad Lightcap explained that the court order came in OpenAI's ongoing lawsuit with The New York Times and other news organizations, which allege that deleted chats may contain evidence of users prompting ChatGPT to generate copyrighted news articles.
To comply with the order, OpenAI must "retain all user content indefinitely going forward, based on speculation" that the news plaintiffs "might find something that supports their case," OpenAI's statement alleged.
The order impacts users of ChatGPT Free, Plus, and Pro, as well as users of OpenAI’s application programming interface (API), OpenAI specified in a court filing this week. But "this does not impact ChatGPT Enterprise or ChatGPT Edu customers," OpenAI emphasized in its more recent statement. It also doesn't impact any user with a Zero Data Retention agreement.
OpenAI plans to challenge the order and is currently pushing for oral arguments, hoping that user testimony will help sway the court to set it aside. But for now, the company claims it's forced to abandon "long-standing privacy norms" and weaken privacy protections that users expect based on ChatGPT's terms of service.
The statement suggested that OpenAI isn't even sure if it can comply with the European Union's strict data privacy law, the General Data Protection Regulation (GDPR), which gives users the "right to be forgotten." All OpenAI could offer to reassure EU users was a note that "we are taking steps to comply at this time because we must follow the law."
In the copyright fight, Magistrate Judge Ona Wang granted the order within one day of the NYT's request. She agreed with the news plaintiffs that ChatGPT users spooked by the lawsuit may have set their chats to delete when using the chatbot to skirt NYT paywalls. Because OpenAI wasn't sharing deleted chat logs, the news plaintiffs had no way of proving that, she suggested.
Now, OpenAI is not only asking Wang to reconsider but has "also appealed this order with the District Court Judge," the Thursday statement said.
"We strongly believe this is an overreach by the New York Times," Lightcap said. "We’re continuing to appeal this order so we can keep putting your trust and privacy first."
Who can access deleted chats?
For worried users, OpenAI has published an FAQ explaining why their data is being retained and how it could be exposed.
For example, the statement noted that the order doesn’t impact OpenAI API business customers under Zero Data Retention agreements because their data is never stored.
And for users whose data is affected, OpenAI noted that their deleted chats could be accessed, but they won't "automatically" be shared with The New York Times. Instead, the retained data will be "stored separately in a secure system" and "protected under legal hold, meaning it can’t be accessed or used for purposes other than meeting legal obligations," OpenAI explained.
Of course, with the court battle ongoing, the FAQ did not have all the answers.
Nobody knows how long OpenAI may be required to retain the deleted chats. Seemingly seeking to reassure users, some of whom appeared to be considering switching to a rival service until the order is lifted, OpenAI noted that "only a small, audited OpenAI legal and security team would be able to access this data as necessary to comply with our legal obligations."