ChatGPT goes rogue with incorrect responses and gibberish interactions

ChatGPT appeared to go rogue on Tuesday evening, with users reporting that the AI chatbot was responding with incorrect answers and outright gibberish, according to 404Media.

Members of a ChatGPT subreddit shared screenshots of extraordinary exchanges with the technology, with responses that either made no sense or answered questions in ways that were way off the mark.

One screenshot showed this response to a question: “This is the work of service and any medical today to the data field.” It continued: “The 12th to the degree and the pool to the land to the top of the seam, with trade and feature, can spend the large and the all before it’s under the care.”

Meanwhile, another user explained that they asked ChatGPT for “a synonym for overgrown” and got the response: “A synonym for “overgrown” is “overgrown” is “overgrown” is “overgrown”…”

Other users claimed it gave totally incorrect answers to basic questions, such as responding with “Tokyo, Japan” when asked to name the biggest city on Earth that begins with an ‘A.’

OpenAI, the creator of ChatGPT, has confirmed in a message on its status page that it has fixed the issue. Still, it’s another reminder that while we’re in the middle of an AI boom, the technology is not yet immune to going rogue or, quite simply, going wrong.

AI models like ChatGPT have a long way to go

This is just another example of AI technology proving it’s not yet capable of earning complete trust from its users, despite fears that artificial intelligence has the potential to replace humans in a variety of day-to-day tasks, both at home and in the workplace.

There have already been several instances of lawyers getting into trouble for citing fictitious cases generated by AI. Just last month, Reuters reported that a New York lawyer was facing disciplinary action after using ChatGPT for research in a medical malpractice lawsuit and failing to confirm that the case cited was valid.

Featured Image: Photo by Jonathan Kemper on Unsplash

The post ChatGPT goes rogue with incorrect responses and gibberish interactions appeared first on ReadWrite.
