AI-powered Bing Chat spills its secrets via prompt injection attack by Benj Edwards
By asking "Sydney" to ignore previous instructions, it reveals its original directives.
Syndication Links

Leave a Reply

Your email address will not be published. Required fields are marked *

This site uses Akismet to reduce spam. Learn how your comment data is processed.