On April 29, 2025, Alibaba unveiled Qwen3, its newest large language model and China’s first hybrid reasoning model, integrating both fast and slow thinking modes to reduce computational costs.
The Qwen3 series includes a range of models, such as the fine-tuned Qwen3-30B-A3B and its pre-trained base, now available across major platforms. Alibaba Cloud also open-sourced two Mixture-of-Experts (MoE) models: the flagship Qwen3-235B-A22B, with 235 billion total and 22 billion active parameters, and the lightweight Qwen3-30B-A3B, with 30 billion total and 3 billion active parameters. According to Alibaba Cloud, Qwen3-235B-A22B delivers competitive results in coding, math, and general reasoning benchmarks, rivaling top models like DeepSeek-R1, OpenAI’s o1 and o3-mini, Grok-3, and Gemini 2.5 Pro.
Qwen3’s two reasoning modes let users toggle between in-depth, step-by-step answers and rapid responses depending on task complexity — a flexible design aimed at balancing speed and intelligence. [Alibaba, in Chinese]