Alibaba Unveils 1-Trillion-Parameter Qwen3-Max-Preview Model

This marks the company’s largest model release to date.

Alibaba’s Qwen AI team has unveiled its most powerful large language model yet, Qwen3-Max-Preview (Instruct), boasting more than 1 trillion parameters. The release positions the company alongside cutting-edge offerings from OpenAI, Anthropic, and Google.

While many labs have recently focused on smaller, more efficient models, Alibaba’s decision to scale upward underscores its ambition to compete at the frontier of AI performance.

Benchmark results show Qwen3-Max-Preview outperforming the team’s previous flagship, Qwen3-235B-A22B-2507, and ranking ahead of Claude Opus 4, Kimi K2, and DeepSeek-V3.1 across key tests including SuperGPQA, AIME25, and LiveBench.

The model is accessible starting today via Qwen Chat, Alibaba Cloud’s API, OpenRouter, and as the default in Hugging Face’s AnyCoder. Unlike prior Qwen releases, however, it is not open source and requires paid API access.
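For developers, access through OpenRouter follows the usual OpenAI-compatible chat-completions pattern. The sketch below assembles such a request; note that the model slug `qwen/qwen3-max-preview` is an assumption for illustration and may differ from OpenRouter’s actual listing.

```python
import json

# Hypothetical sketch: OpenRouter exposes an OpenAI-compatible
# chat completions endpoint. The model slug below is an assumption
# and may differ from the actual OpenRouter listing.
API_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL_SLUG = "qwen/qwen3-max-preview"  # assumed identifier

def build_request(prompt: str, api_key: str) -> dict:
    """Assemble the URL, headers, and JSON body for a chat completion call."""
    return {
        "url": API_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": MODEL_SLUG,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# The returned dict can be sent with any HTTP client, e.g.
# requests.post(req["url"], headers=req["headers"], data=req["body"])
req = build_request("Summarize the Qwen3 family in one sentence.", "sk-...")
print(req["url"])
```

Because the endpoint speaks the OpenAI wire format, existing client libraries can typically be pointed at it by overriding the base URL, rather than writing raw HTTP calls as above.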

Earlier this year, Alibaba released the Qwen3 family of open-weight AI models, featuring a flagship 235B-parameter model and a range of smaller versions, from 0.6B to 32B parameters.

The 235B model, with 22B activated parameters, outperforms OpenAI’s o1 and o3-mini models and rivals Google’s Gemini 2.5 Pro in key benchmarks.

In July, Alibaba also released Qwen3-Coder, which the company calls its most advanced open-source coding model yet.