Microsoft’s Bing chatbot AI is susceptible to several types of “prompt injection” attacks [TechSpot]

View Article on TechSpot

Last week, Microsoft unveiled its new AI-powered Bing search engine and chatbot. A day after folks got their hands on the limited test version, one engineer figured out how to make the AI reveal its governing instructions and secret codename.

Read Entire Article