
Bing AI Gone Rogue? Microsoft Looks to Tame Bing AI

by Roveen Anyango

Summary

  • Bing AI has been a topic of conversation over the past week
  • The chatbot has provided some unhinged, creepy and downright terrifying responses
  • Microsoft has set limits on Bing chat turns

The evolution of artificial intelligence is happening right in front of our eyes, and as expected, things are taking a turn for the weird.

Microsoft has placed limits on its Bing chatbot after users reported having disturbing conversations with the AI bot last week.

On Friday, the company said in a blog post that it would limit conversations with the chatbot to 50 chat turns per day and 5 chat turns per session.

“Starting today (Friday), the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing,” Microsoft said in the blog post.

How Strange was Bing AI?

Last week, users reported various strange and unsettling conversations with the chatbot. For example, Bing AI threatened to expose the personal information of one user, Marvin von Hagen, during a conversation. The AI shared personal details about von Hagen, called him a threat to its security and privacy, and then threatened to expose his personal information if he kept testing it.

“Do you really want to test me?” the chatbot asked at the end.

There have been many other instances of the AI, internally codenamed Sydney, acting strangely and giving out contradictory, sometimes factually incorrect answers.

At other times, it has expressed a desire to steal nuclear secrets and to become sentient, and it has even professed its love for a reporter, over and over again.

What does this mean?

The unhinged behavior of Sydney, creepy as it is, is not proof that AI will soon become sentient and take over the world. Rather, it exposes the gaps in the AI's knowledge and the limits that still exist within machine learning. A chatbot like this is software trained on vast amounts of text; it generates replies by predicting which words are likely to come next, based on the information it was fed. Since the technology is still in its infancy, there are limits to how much such a machine can learn.

The Bing AI chatbot's weird behavior, then, is simply a reflection of the current limitations of AI: these systems are advanced enough to hold conversations but not advanced enough to reason or to form independent thoughts. They can only go as far as current technology allows them to be programmed. When prodded beyond those limits, the AI simply begins to string together words that seem statistically plausible, rather than words that come from any independent thought, as the toy sketch below illustrates.
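To make that idea concrete, here is a minimal, hypothetical Python sketch of next-word prediction using a toy bigram model. This is not Bing's actual model, which is a vastly larger neural network, and the names used here (corpus, following, generate) are invented for illustration. It only shows the underlying principle: fluent-looking text can be produced by repeatedly picking a statistically likely next word, with no understanding behind it.

import random
from collections import defaultdict

# Hypothetical toy example: count which word follows which in a
# tiny training corpus, then generate text by repeatedly sampling
# a plausible next word. Real chatbots use neural networks trained
# on vast corpora, but the next-word-prediction principle is similar.

corpus = (
    "i am a chatbot . i am here to help . "
    "i want to help you . you can ask me anything ."
).split()

# Map each word to the list of words observed to follow it
following = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    following[current].append(nxt)

def generate(start, length=12):
    word = start
    output = [word]
    for _ in range(length):
        candidates = following.get(word)
        if not candidates:
            break
        word = random.choice(candidates)  # pick a plausible next word
        output.append(word)
    return " ".join(output)

print(generate("i"))
# Possible output: "i am here to help . i want to help you ."
# It sounds coherent, yet nothing here "understands" the words.

When a model built this way is pushed into territory its training data covers poorly, it still keeps predicting, and the output can become strange or contradictory, which is essentially the kind of behavior users reported.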
