Critics scoff after Microsoft warns AI feature can infect machines and pilfer data


The goals are solid, but ultimately they depend on users reading dialog windows that warn of risks and require careful consideration before granting approval. That, in turn, limits the security value for many users.

“The general caveat applies to any system that relies on users clicking through a permission prompt,” Earlence Fernandes, a professor at the University of California, San Diego who specializes in AI security, said in an interview. “Sometimes those users don’t fully understand what’s going on, or they can become habituated and click ‘yes’ all the time. At that point, the security boundary isn’t really a boundary anymore.”

As the “ClickFix” attacks show, many users can be duped into following extremely dangerous instructions. More experienced users (including a considerable number of Ars commenters) tend to blame the victims who fall for such scams, but these incidents are inevitable for several reasons. In some cases, even cautious users are tired or under emotional distress and slip up as a result. Others simply lack the knowledge needed to make an informed decision.

One critic said Microsoft’s warning is little more than CYA (short for “cover your ass”), a legal maneuver that attempts to shield a party from liability.

“Microsoft (like the rest of the industry) has no idea how to prevent prompt injections or hallucinations, which makes it fundamentally unsuitable for almost anything serious,” critic Reed Miedecke said. “The solution? Shift the responsibility to the user. Just like every LLM chatbot has the ‘oops, by the way, if you use it for anything important be sure to verify the answers’ disclaimer. Never mind that if you knew the answers, you wouldn’t need a chatbot.”

As Miedecke indicated, most of the criticism extends to the AI offerings that other companies, including Apple, Google, and Meta, are integrating into their products. Often these integrations start out as optional features and eventually become default capabilities, whether users want them or not.


