Psst … wanna jailbreak ChatGPT? Thousands of malicious prompts for sale

Thursday, January 25, 2024, 12:01 PM, from TheRegister
Turns out it's pretty easy to make the model jump its own guardrails
Criminals are getting increasingly adept at crafting malicious AI prompts to get data out of ChatGPT, according to Kaspersky, which spotted 249 such prompts offered for sale online during 2023…
https://go.theregister.com/feed/www.theregister.com/2024/01/25/dark_web_chatgpt/
