Attack

ChatGPT shared conversations indexed by search engines, easily discovered even if potentially confidential

Take action: If you use ChatGPT, check the "Shared Links" dashboard in your account settings and delete any share links you no longer need. Even deleted chats may still be publicly searchable on Google and other search engines. Never share AI conversations unless you are ABSOLUTELY CERTAIN ACCESS IS LIMITED TO A SPECIFIC USER OR GROUP, and always assume anything you share online will become permanently public.


Learn More

Researchers have identified a new leak vector: thousands of conversations with OpenAI's ChatGPT were publicly shared, indexed by search engines, and are now easily searchable.

The leak, discovered in late July 2025, stems from the combination of a ChatGPT feature that allows users to share their AI chats with other people and those shared conversations being indexed and made discoverable by Google and other search engines.

This created a scenario where potentially sensitive personal and business information is accessible to the public and easily discoverable by millions of internet users worldwide.

The basic assumption behind ChatGPT's share feature is that only people with the specific link to a chat can access it. That is weak security, since it relies on obscurity, but it is still better than what happened to many ChatGPT users:

When a user clicked the "Share" button to create a public link for a conversation, they were presented with an optional checkbox labeled "Make this chat discoverable," which allowed search engines to index the content. The feature required users to opt in: first picking a chat to share, then ticking the checkbox to make it discoverable by search engines.

Many users appear to have been unaware of the long-term implications of enabling this feature and expected that their conversations would remain within a limited circle of friends, colleagues, or family members.

Security researchers and journalists discovered that a simple search engine query, "site:chatgpt.com/share", returned thousands of conversations; nearly 4,500 were found in Google search results alone. The exposed conversations ranged from mundane queries about home renovations and recipe ideas to deeply personal discussions involving sensitive information.

Users reported finding conversations with "full legal name, phone number, email, location, and comprehensive work history" as well as people "trying to encode a message to deliver a sketchy package" and various emotional outpourings and trauma discussions.

OpenAI removed the feature from ChatGPT that allowed users to make their public conversations discoverable by search engines. However, the company cautioned that already-indexed content might still appear in search results temporarily due to caching by search engines. Even deleting a conversation from ChatGPT history does not automatically delete the public share link or remove it from search engines. Users must manually delete shared links through ChatGPT's Shared Links dashboard.
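The standard way for a site to keep a page out of search indexes is a robots "noindex" directive, delivered either as a `<meta name="robots">` tag in the HTML or as an `X-Robots-Tag` response header. As a minimal illustrative sketch (not a description of OpenAI's actual implementation), here is how you could spot-check a page you have fetched yourself for either directive:

```python
import re

def meta_has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with 'noindex'."""
    # Scan each <meta ...> tag; tolerate attribute order and case differences.
    for match in re.finditer(r"<meta\s+[^>]*>", html, re.IGNORECASE):
        tag = match.group(0)
        if (re.search(r"name=[\"']robots[\"']", tag, re.IGNORECASE)
                and re.search(r"content=[\"'][^\"']*noindex", tag, re.IGNORECASE)):
            return True
    return False

def header_has_noindex(headers: dict) -> bool:
    """Return True if the X-Robots-Tag response header contains 'noindex'."""
    return "noindex" in headers.get("X-Robots-Tag", "").lower()
```

A page marked this way should eventually drop out of search results, but as the incident showed, cached copies can linger until search engines recrawl the URL, so deleting the share link remains the safer move.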

Google de-indexed many of the URLs after the incident was reported, but other search engines such as Brave, Yahoo, DuckDuckGo, and Bing continued to index the shared conversations. Security researchers noted that modified Google search techniques could still uncover some indexed URLs, and alternative search engines remained effective for discovering the exposed content.

The incident illustrates critical concerns about AI governance and data privacy as more users integrate AI tools into their workflows.

OpenAI has not disclosed the exact number of affected users or the total scope of exposed data beyond the 4,500+ conversations identified by researchers.

For users concerned about their privacy, OpenAI recommends checking the ChatGPT Shared Links dashboard in account settings to review and delete any previously shared conversations. 
