Grok is spreading inaccurate info again, this time about the Bondi Beach shooting

In the same month that Grok said it would opt for another Holocaust rather than vaporize Elon Musk’s brain, the AI chatbot is on the fritz again. Grok has been responding to user requests with inaccurate or completely unrelated information following the Bondi Beach shooting in Australia, which took place during a celebration marking the start of Hanukkah, as first reported by Gizmodo.

According to the latest news reports, Grok’s confusion appears most evident in its responses to a viral video showing a 43-year-old bystander, identified as Ahmed Al Ahmed, snatching a gun from an attacker in a shooting that killed at least 16 people. Grok’s replies repeatedly misidentified the man who stopped one of the gunmen. In other cases, Grok responded to the same image of the Bondi Beach shooting with irrelevant details about allegations of targeted shootings of civilians in Palestine.

The latest replies still reflect Grok’s confusion about the Bondi Beach shooting, whether answering unrelated requests with information about the incident or mixing it up with the shooting at Brown University in Rhode Island. xAI, the developer of Grok, has yet to officially comment on what’s happening with its AI chatbot. However, this isn’t the first time Grok has gone off the rails; earlier this year, it branded itself “MechaHitler.”


