Sharing personal secrets with an AI chatbot can be risky. In early August, many were stunned to find that thousands of ChatGPT conversations were publicly accessible through search engines like Google.
While OpenAI reacted promptly and removed the feature that made shared chats discoverable by search engines, the incident underscores an unsettling truth: people place far too much trust in AI chatbots.
SafetyDetectives, a cybersecurity firm, found that ChatGPT users frequently share private and sensitive information with the AI.
The researchers analyzed 1,000 leaked conversations totalling more than 43 million words and found several chats containing personally identifiable information (PII), including full names, addresses, and ID numbers.
“The negative impact of oversharing with ChatGPT goes beyond the psychological, emotional, and mental factors. Real-world safety concerns are also substantial,” wrote the researchers.
“Not only could PII be used for identity theft and fraud, but delicate details about a user’s life may be used for social engineering scams or blackmail.”
The researchers found that the most common topics were education, law, and law enforcement. While these topics aren't inherently sensitive, their frequency shows that users are turning to AI assistants for professional-level technical knowledge.
Source: safetydetectives.com
Other common chatbot topics include relationships, health, financial advice, and emotional distress.
“We recommend extra vigilance when engaging with chatbots and other AI platforms that don’t have clear privacy disclosures or guarantees,” say the researchers.
“PII and other sensitive information shouldn’t be shared with these services, as there have yet to be clear and strict user protection regulations when it comes to AI use.” The full report is available on safetydetectives.com.