Psst … wanna jailbreak ChatGPT? Thousands of malicious prompts for sale
Thursday, January 25, 2024, 12:01 PM, from TheRegister
Turns out it's pretty easy to make the model jump its own guardrails
Criminals are getting increasingly adept at crafting malicious AI prompts to get data out of ChatGPT, according to Kaspersky, which spotted 249 of these being offered for sale online during 2023.…
https://go.theregister.com/feed/www.theregister.com/2024/01/25/dark_web_chatgpt/