A fake dead grandmother's locket was used to break Bing Chat's CAPTCHA filter

Found 67 days ago at Neowin

Microsoft Bing Chat produced plenty of strange hallucinations in its answers when it first launched earlier in 2023. That included users coaxing it, via cleverly worded text prompts, into revealing internal information such as its code name, Sydney. While the AI chatbot's answers have improved a lot since those early days, some people are still trying to see if it can be tricked into giving out information it is not supposed to reveal.

