Hundreds of thousands of Grok chats found in search results
Hundreds of thousands of user conversations with Elon Musk’s AI chatbot Grok have been indexed by search engines, exposing shared transcripts more widely than users likely expected. When Grok users press the share button to create a unique link to a conversation, those links appear to have become discoverable and searchable on the open web, with Google indexing nearly 300,000 Grok chats. Reports indicate that some transcripts contained sensitive requests and responses, ranging from medical information, meal plans, and secure passwords to instructions for manufacturing illegal drugs.
The exposure echoes earlier incidents with other chatbot platforms where shared conversations became publicly accessible. OpenAI previously scaled back a sharing experiment after ChatGPT transcripts appeared in search results, and Meta faced criticism when shared Meta AI chats surfaced in a public feed. Providers often state that chats are private by default and that users must opt in to share, but the discoverability of shared links raises questions about how visible those links truly are and whether users are adequately informed.
Privacy experts warn that searchable chatbot transcripts can reveal personal and sensitive data even if account identifiers are obscured. Information within prompts—names, locations, health details, business information or relationship matters—can be exposed and may persist online indefinitely. Academics have described the pattern as a significant privacy risk and criticized platforms for not clearly communicating how shared data may be indexed or reused.
From a data protection perspective under EU law, platforms that allow sharing of chat transcripts should assess the legal basis for processing, provide clear, specific information to users about how shared links are handled, and implement technical measures to prevent unintended indexing when users intend limited sharing. Controllers must also consider data minimization and retention principles, and be prepared to support data-subject rights such as erasure if shared content becomes publicly available without lawful basis.
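To illustrate the kind of technical measure that recommendation points to, the sketch below shows how a web service could mark shared-transcript pages as not indexable. It is a minimal example assuming a hypothetical Express server with a /share/:id route; neither the route nor the handler reflects Grok's actual implementation.

```typescript
import express from "express";

const app = express();

// Path-scoped middleware: tell compliant crawlers not to index or follow
// anything served under the (hypothetical) shared-transcript path.
app.use("/share", (_req, res, next) => {
  res.setHeader("X-Robots-Tag", "noindex, nofollow");
  next();
});

app.get("/share/:id", (req, res) => {
  // A real service would look the transcript up by its share token and
  // check that the link has not been revoked or expired.
  const transcriptHtml = `<p>Shared conversation ${req.params.id}</p>`;
  res.send(`<!doctype html>
<html>
  <head>
    <!-- Repeat the directive as a meta tag for tools that only read HTML. -->
    <meta name="robots" content="noindex, nofollow">
    <title>Shared conversation</title>
  </head>
  <body>${transcriptHtml}</body>
</html>`);
});

// Note: a robots.txt "Disallow: /share/" rule is not a substitute. Crawlers
// blocked by robots.txt never fetch the page, so they never see the noindex
// directive, and the bare URL can still surface in search results if other
// sites link to it.

app.listen(3000);
```

Directives like these only bind crawlers that choose to honor them. For links users expect to remain semi-private, stronger complements include authentication, unguessable and expiring share tokens, and an explicit, clearly explained opt-in before a conversation is listed publicly.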