BABA-W's Tongyi Qianwen Unveils Qwen3 Series AI Foundation Models
BABA-W (09988.HK) (BABA.US) unveiled the Qwen3 series of AI foundation models under its Tongyi Qianwen brand, comprising two Mixture of Experts (MoE) models and six dense models with parameter counts ranging from 0.6B to 235B.

The flagship Qwen3-235B-A22B competes strongly against top models like DeepSeek-R1, OpenAI’s o1 and o3-mini, xAI’s Grok-3, and Google’s Gemini-2.5-Pro, excelling in coding, math, and general capability benchmarks.

The compact MoE model Qwen3-30B-A3B outperforms QwQ-32B despite using only about 10% of its active parameters, while the small Qwen3-4B model matches the performance of Qwen2.5-72B-Instruct.
AAStocks Financial News