
Artificial intelligence company OpenAI has been actively contesting a court order that raises serious privacy concerns. The order requires OpenAI to preserve all ChatGPT user logs, including deleted conversations and chat records generated through API calls.
Notably, the court issued this directive not out of concern for user privacy or content security, but out of fear that OpenAI might destroy logs containing potentially infringing material. The order arises from copyright lawsuits filed by news organizations and publishers, who accuse OpenAI of scraping their content without authorization to train its AI models.
The plaintiffs further alleged that OpenAI was actively destroying evidence. In response, the court ordered OpenAI to preserve all ChatGPT and API logs to prevent any loss of evidence that could harm the plaintiffs' interests.
OpenAI, for its part, has agreed to retain and share only a subset of chat logs: specifically, those for which users have given explicit consent. However, the presiding judge concluded that, absent a binding order, OpenAI might continue deleting data. As a result, the court granted the plaintiffs' request and compelled OpenAI to maintain a complete log archive.
The question now is whether the injunction will stand. If OpenAI is ultimately compelled to comply, the privacy and data security of all ChatGPT users, including developers and enterprises using its API, could be severely compromised. OpenAI is therefore vigorously challenging the ruling.
In a filing with the court, OpenAI contended that the May 13 order is premature and should be rescinded until, at the very least, the plaintiffs can substantiate the necessity of retaining all user chat logs. The company argued that enforcing such a sweeping and unprecedented directive could pose a significant threat to the privacy of hundreds of millions of ChatGPT users worldwide.
OpenAI further asserted that it has not destroyed any evidence and that there is no proof it has deliberately erased data. Nor, the company added, is there any evidence that users who may have engaged in copyright infringement are more likely to delete their chat histories.