Cybersecurity experts have warned The U.S. Sun that chatbots represent a "treasure trove" ripe for exploitation by criminals. Artificial intelligence chatbots are advancing rapidly, becoming smarter, more accessible, and more efficient.
Because these AI systems mimic human conversation so well, there's a temptation to trust them and divulge sensitive information.
Jake Moore, Global Cybersecurity Advisor at ESET, explained that while the AI "models" behind chatbots are generally secure, there are hidden dangers.
"With companies like OpenAI and Microsoft leading the development of chatbots, they closely protect their networks and algorithms," Jake stated. "If these were compromised, it would jeopardize their business future."
A New Threat Landscape
Jake pointed out that the primary risk lies in the potential exposure of the information you share with chatbots.
The details you share during chatbot interactions are stored somewhere, just as your texts, emails, and backup files are, and how secure those conversations remain depends on how well that stored data is protected. "The data you input into chatbots is stored on a server and, despite encryption, could become as valuable as personal search histories to cybercriminals," Jake explained.
"There is already a significant amount of personal information being shared. With the anticipated launch of OpenAI's search engine, even more sensitive data will be at risk in a new and attractive space for criminals."
Jake emphasized the importance of using chatbots that encrypt your conversations. Encryption scrambles data, making it unreadable to unauthorized users.
Fortunately, OpenAI encrypts all ChatGPT conversations in transit and at rest, whether you're a free or paid user.
However, some apps may charge for encryption or not offer it at all. Even encrypted conversations may be used to train chatbot models, although ChatGPT lets users opt out and delete their data.
"People must be careful about what they input into chatbots, especially in free accounts that don’t anonymize or encrypt data," Jake advised.
Security expert Dr. Martin J. Kraemer of KnowBe4 also emphasized the need for caution.
"Never share sensitive information with a chatbot," Dr. Kraemer advised. "You may need to share certain details like a flight booking code with an airline chatbot, but that should be an exception. It's safer to call directly instead of using a chatbot. Never share your password or other authentication details with a chatbot. Also, avoid sharing personal thoughts and intimate details, as these could be accessed by others."