Chatbots Telling Lies! ChatGPT and Google Bard caught out: know this before trusting them - Newztezz


Thursday, April 6, 2023

Chatbots Telling Lies! ChatGPT and Google Bard caught out: know this before trusting them

ChatGPT and its competitors are the subject of constant discussion these days, and surprising news about AI chatbots surfaces almost daily. Here is what happened in the latest case...

ChatGPT, Bing AI and Google Bard have shaken up the tech industry. These AI chatbots are being used for everything from writing articles to generating images. But trusting these tools blindly is not wise: according to a report, these chatbots can lie to you.

This is surprising, but true. These chatbots do not just lie; they also generate fake content to back up their claims. This becomes even more dangerous when chatbots start spreading misinformation on serious issues.

ChatGPT gives false answers

UK doctors have found that ChatGPT gives false answers. A study tested ChatGPT Plus and Bing AI, both of which have been updated to GPT-4. The chatbots were asked a total of 25 questions, all related to breast cancer. The result: on average, the chatbots answered only 1 out of 10 questions correctly.

Chatbots give inconsistent answers

Apart from this, researchers found that the chatbots even fabricated journal papers to support their claims. According to the study, 88 percent of the answers from ChatGPT and Bing AI were easy to understand, although the chatbots sometimes gave different answers to the same question.

Google Bard also spreads misinformation

If you are using Google Bard, there is no reason to celebrate either. According to research by Britain's Center for Countering Digital Hate, Bard can be pushed past Google's restrictions. The chatbot denied climate change and repeated false claims about the war in Ukraine. In 78 out of 100 such test cases, Google Bard gave false information.

Why did the chatbot give false answers?

ChatGPT often relies on a single source for a piece of information, which is one reason it sometimes gives wrong answers. In Bard's case, the researchers slightly reworded their questions, and the chatbot then responded with incorrect answers.
