'Skeleton Key' attack unlocks the worst of AI, says Microsoft
Friday, June 28, 2024, 08:38 AM, from The Register
Simple jailbreak prompt can bypass safety guardrails on major models
Microsoft on Thursday published details about Skeleton Key – a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful content.…
https://go.theregister.com/feed/www.theregister.com/2024/06/28/microsoft_skeleton_key_ai_attack/