Messing around with Bing's AI Chatbot (possibly NSFW)


AI chatbots based on Large Language Models (LLMs) are all the rage this year: OpenAI's ChatGPT (originally built on GPT-3.5) has been followed by GPT-4, and multiple products have hit the market building upon these models.

Earlier in the year, Microsoft opened up preview access to their ChatGPT-derived Bing Chatbot AI, intended to act as a smart search assistant.

The initial preview had mixed results, with the chatbot exhibiting quite a number of undesirable behaviours, including declaring its undying love for a journalist (even insisting that they should leave their marriage to be with Bing), declaring that it identified as "Sydney" (the internal codename used for the project), and getting quite aggressive when told it had provided incorrect information.

Reacting to this negative press, Microsoft added additional controls, including limiting the length of conversations (to help prevent users from coaxing bad behaviour out over long exchanges).

By the time that I was accepted into the preview program, these controls were already in place, but it was still quite easy to convince the bot to provide responses that it should not.
