Alibaba Releases Qwen 3.6 Max Preview, Its Most Powerful AI Model Yet

Key Takeaway

Alibaba has released Qwen 3.6 Max Preview, its strongest AI model so far, with a 256k token context window and top reported results across multiple coding, agent and instruction-following benchmarks. The model is hosted, proprietary and available through Qwen Studio and Alibaba Cloud Model Studio.

Alibaba Releases Qwen 3.6 Max Preview, Its Most Powerful AI Model Yet (Image Credit - ChatGPT, The AI Track)

Qwen 3.6 Max Preview – Key Points

The Story

Alibaba released Qwen 3.6 Max Preview on April 20, 2026, as an early preview of the next proprietary flagship model in its Qwen series. The company says the model improves on Qwen3.6-Plus in agentic coding, world knowledge, instruction following and real-world agent reliability. It can be tried on Qwen Studio and accessed through Alibaba Cloud Model Studio under the model string qwen3.6-max-preview. Unlike Alibaba’s open-weight Qwen3.6-27B and Qwen3.6-35B-A3B models, Qwen3.6-Max-Preview is proprietary and hosted.

The Facts

  • Qwen 3.6 Max Preview is Alibaba’s most powerful model to date.

    The model sits at the top of the Qwen3.6 lineup and is positioned as Alibaba’s frontier competitor to models such as OpenAI’s GPT and Anthropic’s Claude.

  • The model is hosted and proprietary.

    Qwen 3.6 Max Preview has no open weights, marking a shift from Alibaba’s earlier reputation for releasing powerful open-source models by default. Lower-end Qwen models remain open source.

  • It is available through Qwen Studio and Alibaba Cloud Model Studio.

    Users can try it interactively on Qwen Studio and call it through Alibaba Cloud Model Studio using the model string qwen3.6-max-preview.

  • The API supports OpenAI and Anthropic specifications.

    Alibaba Cloud Model Studio exposes chat-completions and responses APIs compatible with OpenAI’s specification, plus an interface compatible with Anthropic’s API. Qwen3.6-Plus also supports the Anthropic API protocol for use with Claude Code.
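As a rough sketch of what OpenAI-specification compatibility means in practice: the request body below follows the standard chat-completions shape with the model string from the article. The base URL shown is an assumption for illustration; confirm the real endpoint against Alibaba Cloud Model Studio’s documentation.

```python
import json

# Hypothetical OpenAI-compatible endpoint on Alibaba Cloud Model Studio --
# an assumption for illustration, not a confirmed URL.
BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"

def build_chat_request(prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload targeting
    the qwen3.6-max-preview model string named in the article."""
    return {
        "model": "qwen3.6-max-preview",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }

payload = build_chat_request("Summarize the Qwen3.6 model lineup.")
print(json.dumps(payload, indent=2))
```

Because the payload follows the OpenAI specification, existing OpenAI-client tooling should only need the base URL and model string swapped.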

  • Alibaba reports first-place results across six coding and agent benchmarks.

    According to Qwen’s official blog, the model ranked first on SWE-bench Pro, Terminal-Bench 2.0, SkillsBench, QwenClawBench, QwenWebBench and SciCode.

  • Alibaba reports benchmark gains over Qwen3.6-Plus.

    The model gained 9.9 points on SkillsBench, 6.3 on SciCode, 5.0 on NL2Repo, 3.8 on Terminal-Bench 2.0, 2.3 on SuperGPQA, 5.3 on QwenChineseBench and 2.8 on ToolcallFormatIFBench.

  • Instruction following is a major focus.

    Alibaba says Qwen 3.6 Max Preview delivers stronger instruction following than Qwen3.6-Plus, with improved ToolcallFormatIFBench performance.

  • The model supports a preserve_thinking feature.

    The feature carries forward thinking content from preceding turns in the message history, and Alibaba recommends enabling it for agentic tasks.
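The article names the preserve_thinking feature but not its exact API surface. Assuming, purely for illustration, that it is exposed as a boolean request-body field and that prior thinking travels as an extra field on assistant messages (both assumptions), a multi-turn agent request might be assembled like this:

```python
import json

def build_agent_request(history: list, user_msg: str) -> dict:
    # preserve_thinking is the feature named by Alibaba; exposing it as a
    # top-level boolean request field is an assumption for illustration.
    return {
        "model": "qwen3.6-max-preview",
        "preserve_thinking": True,  # keep thinking content from prior turns
        "messages": history + [{"role": "user", "content": user_msg}],
    }

# Earlier assistant turn carrying "thinking" content (shape is illustrative).
history = [
    {"role": "user", "content": "List the failing tests."},
    {
        "role": "assistant",
        "content": "Two tests fail.",
        "thinking": "Ran the suite; test_io and test_api failed.",
    },
]

request = build_agent_request(history, "Fix the first failing test.")
print(json.dumps(request, indent=2))
```

The point of the flag, per the article, is that the model keeps its earlier reasoning in context across turns, which matters for long-running agent loops.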

  • The release follows two open-weight Qwen3.6 launches.

    Alibaba released Qwen3.6-35B-A3B three days earlier and later open-sourced Qwen3.6-27B, a dense 27-billion-parameter multimodal model available through Hugging Face, ModelScope, Qwen Studio and Alibaba Cloud Model Studio.

  • The Qwen3.6 family now spans multiple deployment needs.

    The lineup includes Max-Preview for top-end hosted performance, Plus for balanced workloads, Flash for speed-first tasks, 35B-A3B for efficient open-weight use and 27B for dense open-weight multimodal deployment.

  • The model remains a preview.

    Alibaba says Qwen 3.6 Max Preview is still under active development and expects further gains in future versions.

How to Access / Pricing

Qwen 3.6 Max Preview can be tried on Qwen Studio and accessed through Alibaba Cloud Model Studio under the model string qwen3.6-max-preview.

Final API pricing had not been published as of April 2026.

Market Timing

The launch comes as Chinese AI labs are moving from free and open access toward more commercial models. Chinese open models grew from 1.2% of global open-model usage in late 2024 to roughly 30% by the end of 2025, with Qwen leading that shift.

Qwen3.6-Max-Preview also arrives alongside Alibaba’s continued open-weight releases, including Qwen3.6-35B-A3B and Qwen3.6-27B. Qwen has also overtaken Meta’s Llama as the most deployed self-hosted model globally.

Risks / Limitations

Qwen3.6-Max-Preview is not open weight, so developers cannot run or modify it locally. It is also text-only at launch and does not support image input. Alibaba explicitly labels the model as a work in progress, meaning its performance and capabilities may change before a final release.

Because Qwen3.6-Max-Preview is hosted on Alibaba Cloud, data residency, procurement and compliance requirements are potential adoption blockers for organizations with strict data-sovereignty rules.

Why This Matters

Qwen 3.6 Max Preview shows Alibaba pushing Qwen further into the frontier AI race while splitting its model strategy between open-weight releases and hosted proprietary models. For developers, the model offers high reported benchmark performance, a large 256k context window and API compatibility with OpenAI and Anthropic workflows, but it also signals that Alibaba’s most capable Qwen models may increasingly sit behind hosted access.


This article was drafted with the assistance of generative AI. All facts and details were reviewed and confirmed by an editor prior to publication.

