DeepSeek proved knowledge distillation works very well and cheaply https://en.m.wikipedia.org/wiki/Knowledge_distillation
But they didn’t show how to build a new frontier model cheaply.
So you still need massive investment to build new frontier models. The bad part is that, once built, they can be replicated cheaply.
https://stratechery.com/2025/deepseek-faq/
That post has a great overview: this is a new model, but also a distillation. They used new techniques to make it really cheap (comparatively).
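For the curious, the core idea of knowledge distillation is simple: train a small student model against the teacher's *softened* output probabilities rather than hard labels. A minimal NumPy sketch of the classic temperature-scaled distillation loss (per Hinton et al.'s formulation; the function names and toy logits here are illustrative, not from any DeepSeek code):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about wrong classes
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients stay comparable across temperatures
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T * T * (p * (np.log(p) - np.log(q))).sum(axis=-1).mean())

# Toy check: a student that matches the teacher incurs zero loss,
# while an uninformed (uniform-logit) student does not
teacher = np.array([[4.0, 1.0, 0.5]])
print(distillation_loss(teacher, teacher))        # ~0.0
print(distillation_loss(np.zeros((1, 3)), teacher) > 0)  # True
```

The cheapness follows from the setup: the expensive part (the teacher) already exists, and the student only needs forward passes through it to get training targets.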