Unintended Data Disclosure: How ChatGPT Shared Sensitive Medical Information
I recently had a surprising and unsettling experience while using ChatGPT. I had simply asked for advice on choosing the right type of sandpaper. The response I received was startling: instead of relevant information, I was shown an overview of someone else's medical test results from across the country.
What made this incident even more concerning is that I was able to request and receive a downloadable file containing this personal information, complete with signatures and other sensitive details. Naturally, this has left me feeling alarmed about potential privacy breaches and the safety of sharing questions with AI tools.
I am hesitant to share the full transcript publicly, as I don’t want to further disseminate someone else’s private data. I want to emphasize that I take privacy seriously and am unsure of how this information appeared in my conversation.
Clarification and Reflection
To address possible concerns, I'd like to clarify that I initially included most of the chat transcript in a comment, but then removed the section where I asked ChatGPT, "What information do you know about me?" That prompt resulted in ChatGPT listing some personal information about me that I'd prefer to keep private, and I was worried that sharing it might lead to further personal details being revealed.
It’s worth noting that I have reason to believe the information provided by ChatGPT aligns with actual personal details of individuals in the region, as I cross-checked the names and locations. Additionally, I named my instance of ChatGPT “Atlas,” which is why I referred to the AI by that name.
Further Context
For those interested, I've linked to the original Reddit comment where I detailed this experience. Many commenters there suggested I might be "shady," but I want to stress that I don't spend much time online or on Reddit. My intent is purely to share my experience and raise awareness of potential AI privacy vulnerabilities.
Final Thoughts
This incident underscores the importance of understanding the limitations and risks associated with AI tools like ChatGPT, especially regarding data privacy. While these systems are powerful and useful, they can sometimes access or produce information that appears to be private or sensitive—whether due to hallucinations, data leaks, or other unexpected factors.
Moving forward, I recommend exercising caution when sharing personal information with AI chatbots and remaining vigilant about potential privacy risks. If you encounter something similar, consider reporting it to the platform and avoiding further dissemination of the exposed information.