Anthropic accuses DeepSeek and other Chinese firms of using Claude to train their AI

Anthropic claims that DeepSeek and two other Chinese AI companies misused its Claude AI models in an effort to improve their own products. In an announcement on Monday, Anthropic said the “industrial-scale campaign” involved the creation of approximately 24,000 fraudulent accounts and more than 16 million exchanges with Claude, as first reported by the Wall Street Journal.

The three companies – DeepSeek, MiniMax and Moonshot – are accused of “distilling” Claude, that is, training smaller AI models on the outputs of more advanced ones. Although Anthropic says distillation is a “legitimate training method,” it adds that it “can also be used for illicit purposes,” including “acquiring powerful capabilities from other laboratories in a fraction of the time and developing them independently at a fraction of the cost.”

Anthropic says illicitly distilled models are “unlikely” to carry over its existing safety measures. “Foreign laboratories that distill American models could feed these untested capabilities into military, intelligence, and surveillance systems—enabling authoritarian governments to deploy frontier AI for offensive cyber operations, disinformation campaigns, and mass surveillance,” the company writes.

According to Anthropic, DeepSeek, which caused a stir in the AI industry with its powerful but more efficient models, made more than 150,000 exchanges with Claude, targeting its reasoning capabilities. It is also accused of using Claude to generate “censorship-safe alternatives to politically sensitive questions about dissidents, party leaders, or authoritarianism.” In a letter to lawmakers last week, OpenAI similarly accused DeepSeek of “ongoing efforts to free ride on capabilities developed by OpenAI and other US frontier laboratories.”

Moonshot and MiniMax had more than 3.4 million and 13 million interactions with Claude, respectively. Anthropic is calling on others in the AI industry, cloud providers, and lawmakers to address distillation, saying that “restricted chip access” could limit model training and “the scale of illicit distillation.”


