Dead grandma locket request tricks Bing Chat’s AI into solving security puzzle
Found 67 days ago at Ars Technica

Bing Chat, an AI chatbot from Microsoft similar to ChatGPT, allows users to upload images for the AI model to examine or discuss. Normally, Bing Chat refuses to solve CAPTCHAs, which are visual puzzles designed to prevent automated bots from filling out forms on the web. On Saturday, an X user devised a visual jailbreak that circumvents Bing Chat's CAPTCHA filter by tricking it into reading...