DeepSeek, OpenAI and distillation
Is DeepSeek's AI 'distillation' theft? OpenAI seeks answers over China's breakthrough
Since Chinese artificial intelligence (AI) start-up DeepSeek rattled Silicon Valley and Wall Street with its cost-effective models, the company has been accused of data theft through a practice that is common across the industry.
Did DeepSeek Copy Off Of OpenAI? And What Is Distillation?
The Medium post goes over various flavors of distillation, including response-based distillation, feature-based distillation and relation-based distillation. It also covers two fundamentally different modes of distillation – offline and online distillation.
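Of those flavors, response-based, offline distillation is the simplest to picture: a frozen teacher's softened output distribution supervises a smaller student. Below is a minimal sketch assuming a PyTorch classification setting; the temperature and loss weighting are illustrative choices, not values tied to any model in the coverage.

```python
# Minimal sketch of response-based, offline knowledge distillation.
# Assumes a PyTorch classification setting; temperature T and weight alpha
# are illustrative choices, not values used by any model mentioned above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend the teacher's softened predictions with the hard-label loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                      # rescale gradient magnitude for temperature
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Offline mode: the teacher is pre-trained and frozen; only the student trains.
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(inputs)
# loss = distillation_loss(student(inputs), teacher_logits, labels)
# loss.backward()
```

Feature-based and relation-based variants change what is imitated (intermediate activations, or relations between examples) but keep the same teacher-student structure.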
DeepSeek used OpenAI’s model to train its competitor using ‘distillation,’ White House AI czar says
David Sacks says OpenAI has evidence that Chinese company DeepSeek used a technique called "distillation" to build a rival model.
Why ‘Distillation’ Has Become the Scariest Word for AI Companies
DeepSeek’s success learning from bigger AI models raises questions about the billions being spent on the most advanced ...
The Secret To China’s AI Prowess Might Be Copying American Tech
Microsoft and OpenAI are investigating whether DeepSeek, a Chinese artificial intelligence startup, illegally copied ...
What is Distillation of AI Models: Explained in short
Whether it's ChatGPT over the past couple of years or DeepSeek more recently, the field of artificial intelligence (AI) has ...
Here’s How Big LLMs Teach Smaller AI Models Via Leveraging Knowledge Distillation
AI-driven knowledge distillation is gaining attention. LLMs are teaching SLMs. Expect this trend to increase. Here's the ...
What is AI distillation and what does it mean for OpenAI?
One possible answer being floated in tech circles is distillation, an AI training method that uses bigger "teacher" models to train smaller but faster-operating "student" models.
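For large language models, the teacher-student idea usually takes a sequence-level form: the bigger model generates responses, and the smaller one is fine-tuned on those (prompt, response) pairs. The sketch below illustrates that pattern with Hugging Face Transformers; the GPT-2 checkpoints, prompt, and hyperparameters are stand-in assumptions, not a reconstruction of any company's pipeline.

```python
# Hedged sketch of sequence-level distillation for language models:
# a larger "teacher" LM generates text, and a smaller "student" LM is
# fine-tuned on that text with an ordinary language-modeling loss.
# Checkpoints, prompt, and hyperparameters are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

teacher_name = "gpt2-large"   # stand-in teacher; any larger causal LM works
student_name = "gpt2"         # stand-in student

tok = AutoTokenizer.from_pretrained(teacher_name)
tok.pad_token = tok.eos_token              # GPT-2 has no pad token by default
teacher = AutoModelForCausalLM.from_pretrained(teacher_name).eval()
student = AutoModelForCausalLM.from_pretrained(student_name)

prompts = ["Explain model distillation in one sentence."]

# 1) The frozen teacher produces responses that will serve as training data.
with torch.no_grad():
    batch = tok(prompts, return_tensors="pt", padding=True)
    generated = teacher.generate(**batch, max_new_tokens=64,
                                 pad_token_id=tok.eos_token_id)
texts = tok.batch_decode(generated, skip_special_tokens=True)

# 2) The student is fine-tuned on the teacher's outputs (one step shown;
#    a real run would mask padding tokens out of the loss and loop over data).
optimizer = torch.optim.AdamW(student.parameters(), lr=5e-5)
enc = tok(texts, return_tensors="pt", padding=True)
loss = student(**enc, labels=enc["input_ids"]).loss
loss.backward()
optimizer.step()
```

The dispute in the coverage above is not whether this technique works, but whether training on another provider's model outputs without permission amounts to theft.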
OpenAI Alleges DeepSeek Used Proprietary Models Without Permission As Microsoft Investigates AI Security Risks: 'There's Substantial Evidence'
OpenAI announced it has uncovered evidence that Chinese artificial intelligence startup DeepSeek allegedly used its ...
Why blocking China's DeepSeek from using US AI may be difficult
Top White House advisers this week expressed alarm that China's DeepSeek may have benefited from a method that allegedly ...
Related topics
DeepSeek
China
Artificial intelligence
Donald Trump
United States