Distillation is the practice of training a smaller AI model on the outputs of a more advanced one. This lets developers shortcut the painstaking and costly process of building a model from the ground up.
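In rough terms, distillation works by having the smaller "student" model learn to match the output probability distribution of the larger "teacher" model, rather than learning only from hard labels. The sketch below is illustrative, not any lab's actual pipeline; the logits and temperature value are made-up numbers chosen for the example.

```python
import math

def softmax(logits, temperature=1.0):
    # A temperature > 1 softens the distribution, exposing the teacher's
    # relative preferences among classes ("dark knowledge").
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened outputs (the targets)
    # and the student's softened outputs; minimizing this pulls the
    # student's distribution toward the teacher's.
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))

# Hypothetical logits for a single input, for illustration only.
teacher = [4.0, 1.0, 0.2]   # large model's raw scores
student = [2.0, 1.5, 0.5]   # smaller model's raw scores
loss = distillation_loss(student, teacher)
```

In a real training loop this loss would be backpropagated through the student over many examples; querying a deployed chatbot at scale to collect such teacher outputs is the behavior the allegations below describe.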
Chinese AI labs allegedly used 24,000 fraudulent accounts to extract capabilities from Anthropic's Claude chatbot in coordinated "distillation" attacks.
David Sacks, U.S. President Donald Trump's AI and crypto czar, says OpenAI has evidence that the Chinese company DeepSeek used a technique called "distillation" to build a rival model.
The campaigns detailed by the AI upstart entailed the use of fraudulent accounts and commercial proxy services to access Claude at scale while avoiding detection. Anthropic said it was able to attribute ...
Anthropic alleges Chinese AI firms used distillation to extract Claude data, raising concerns over AI model training ...
The AI company claims DeepSeek, Moonshot, and MiniMax used fraudulent accounts and proxy services to extract Claude’s ...
In a significant escalation of the tensions surrounding artificial intelligence security, Anthropic issued a formal public complaint on February 23, 2026, ...