News

DeepSeek V3.1 is finally here, and while it performs significantly better than R1, it doesn't outperform GPT-5 Thinking or ...
China's DeepSeek has released a 685-billion parameter open-source AI model, DeepSeek V3.1, challenging OpenAI and Anthropic ...
DeepSeek V3.1 launches with 128k context, 685B parameters, top coding scores, and delays its R2 model due to issues with Huawei’s Ascend chips.
DeepSeek launches V3.1 with doubled context, advanced coding, and math abilities. Featuring 685B parameters under MIT Licence ...
DeepSeek announced the v3.1 model through a message on WeChat, China's widely used social platform, and on the Hugging Face community website. The new model boasts ...
In a significant update, Chinese AI startup DeepSeek has launched the latest DeepSeek-V3.1. It's a major upgrade to its ...
DeepSeek launches V3.1 with faster reasoning, domestic chip support, open-source release, and new API pricing, marking its ...
Chinese startup DeepSeek has released its largest AI model to date, a 685-billion-parameter model that industry observers say ...
The Chinese start-up has introduced only a few incremental updates in recent months, while competitors have released new ...
In a quiet yet impactful move, DeepSeek, the Hangzhou-based AI research lab, has unveiled DeepSeek V3.1, an upgraded version ...
Point release retuned with new FP8 datatype for better compatibility with homegrown silicon. Chinese AI darling DeepSeek ...