'Skeleton Key' attack unlocks the worst of AI, says Microsoft

Friday, June 28, 2024, 08:38 AM, from The Register
Simple jailbreak prompt can bypass safety guardrails on major models
Microsoft on Thursday published details about Skeleton Key – a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful content…
https://go.theregister.com/feed/www.theregister.com/2024/06/28/microsoft_skeleton_key_ai_attack/