June 2024

A dangerous new jailbreak for AI chatbots was just discovered

Microsoft has released details about a troubling new generative AI jailbreak technique that can bypass a chatbot's safety guardrails.