Alibaba Group has publicly committed to continuing its open-source release strategy for AI models, confirming that new entries in both its Qwen large language model family and Wan visual generation series will be made freely available to developers. The announcement, reported across multiple sources on March 22, comes as the Qwen family surpassed 700 million downloads on Hugging Face, with over 170,000 derivative models created by the community.
The scale of adoption places Qwen among the most widely used open-weight model families globally, alongside Meta’s Llama and Mistral’s releases. More than one million corporate and individual users have accessed Qwen through Alibaba’s Model Studio development platform. These figures position Alibaba as the dominant open-source AI provider in China and a significant competitor to Western alternatives.
Alibaba’s open-source commitment is backed by substantial infrastructure investment. The company has pledged 380 billion yuan ($53 billion) over three years for AI and cloud infrastructure — a figure that exceeds the individual commitments of most Western AI companies outside of Microsoft and Google. This spending covers GPU clusters, data center construction, and the compute required to train successive generations of foundation models that will be released openly.
The strategic logic is straightforward: open-sourcing builds ecosystem lock-in around Alibaba’s cloud platform. Developers who build on Qwen models tend to deploy on Alibaba Cloud, generating recurring revenue from inference compute and enterprise services. This mirrors the approach Red Hat pioneered with Linux: give away the software, sell the infrastructure and support. For Alibaba, each derivative model and each enterprise deployment represents a potential cloud customer.
The commitment also has geopolitical dimensions. As U.S. export controls restrict Chinese access to advanced Nvidia chips, Chinese AI labs face pressure to demonstrate competitiveness through model quality rather than hardware superiority. Alibaba’s willingness to release models openly, and to invest $53 billion in the effort, suggests confidence that its models can compete on merit in a global developer market that increasingly values open weights, permissive licensing, and transparent training methodologies.
