Qwen publishes three Qwen 3.5 models on Hugging Face—9B‑Base, 2B, and 0.8B—expanding its open‑source lineup
9 billion parameters. That’s the size of the new Qwen 3.5‑Base model published to Hugging Face, released alongside 2 billion and 0.8 billion‑parameter variants that expand the open‑source AI lineup, reports indicate.
Key Facts
- Key company: Qwen
- Also mentioned: Hugging Face
Hugging Face’s open‑source catalog now includes three new Qwen 3.5 variants—a 9 billion‑parameter “Base” model, a 2 billion‑parameter version and a lightweight 0.8 billion‑parameter model—each uploaded to the platform this week under the Qwen organization. The releases are tagged for “image‑text‑to‑text” and “conversational” pipelines, ship as safetensors, and carry an Apache‑2.0 license, according to the model pages on Hugging Face (see Qwen/Qwen3.5‑9B‑Base, Qwen/Qwen3.5‑2B, Qwen/Qwen3.5‑0.8B)【report】. While download counts remain modest (the 9B model shows zero downloads, and the 2B and 0.8B each show six), the additions signal a strategic push to broaden the ecosystem of large language models (LLMs) that can be fine‑tuned without the constraints of proprietary licensing.
The Qwen 3.5 family builds on Alibaba’s internal Qwen line, which the Chinese tech giant announced earlier this month as part of its “agentic AI era” rollout. Reuters reported that Alibaba unveiled the Qwen 3.5 series to accelerate the development of AI agents that can handle multi‑modal tasks, positioning the models as a bridge between pure chatbots and more autonomous assistants【Reuters】. CNBC echoed the narrative, noting that the release marks a shift in China’s chatbot race toward agents capable of integrating vision, language and tool use【CNBC】. By publishing the models on Hugging Face, Alibaba effectively opens a channel for global developers to experiment with the same architecture that powers its internal services, potentially accelerating adoption outside of China’s tightly regulated AI market.
All three variants share a common “image‑text‑to‑text” pipeline, suggesting that they are primed for tasks that combine visual input with natural‑language generation—an area where open‑source alternatives have lagged behind commercial offerings. The model cards list “endpoints_compatible” and “region:us,” indicating that the files are ready for deployment on Hugging Face’s inference endpoints in the United States, a detail that could lower the barrier for startups and research labs seeking low‑latency, multi‑modal capabilities. The 9B‑Base model, despite its larger size, is still modest compared with the 175 billion parameters of OpenAI’s GPT‑3 (the parameter count of GPT‑4 has not been disclosed), but it offers a more tractable compute footprint for organizations without massive GPU clusters.
Hugging Face’s decision to host the Qwen 3.5 models aligns with its broader mission to democratize AI. The platform’s “transformers” library already supports the Qwen family, and the new releases are automatically compatible with the same API, meaning developers can swap in a Qwen 3.5 model with a single line of code. The open‑source community has already begun to test the models; the 4B‑Base variant, which was released earlier, has accumulated 672 downloads—a modest but tangible signal of interest【report】. By contrast, the newest 9B, 2B and 0.8B models have yet to see significant uptake, reflecting the typical lag between model publication and real‑world experimentation.
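The “single line of code” swap described above can be sketched with the transformers pipeline API. The model ids below are those listed on the Hugging Face model pages cited in this article; the call signature and whether each checkpoint actually supports this pipeline task are assumptions, not confirmed details:

```python
# Hypothetical sketch: swapping Qwen 3.5 variants through the transformers
# pipeline API. Model ids match the Hugging Face pages cited above; the
# "image-text-to-text" task tag is taken from the model cards.

QWEN35_MODELS = {
    "9B-Base": "Qwen/Qwen3.5-9B-Base",
    "2B": "Qwen/Qwen3.5-2B",
    "0.8B": "Qwen/Qwen3.5-0.8B",
}

def load_qwen(size: str):
    """Build a pipeline for one variant; changing `size` is the only
    edit needed to swap models."""
    from transformers import pipeline  # heavy import kept local
    return pipeline("image-text-to-text", model=QWEN35_MODELS[size])

if __name__ == "__main__":
    # Downloads weights on first run; requires network access and a GPU
    # for the larger variants.
    pipe = load_qwen("0.8B")
```

Because all three variants share the same tags and format, moving between them is a configuration change rather than a code rewrite, which is the practical upside of the shared API the paragraph describes.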
Analysts see the move as a bid to challenge the dominance of Western LLMs in the open‑source arena. With Alibaba’s backing, the Qwen 3.5 series could attract contributions that improve safety, alignment and multilingual performance—areas where existing open‑source models such as LLaMA and Falcon have faced criticism. If the community embraces the models, they may become a de facto standard for Chinese‑language and cross‑modal AI research, complementing the growing portfolio of non‑English open‑source LLMs. For now, the real test will be whether developers can translate the modest download numbers into production‑grade applications, a process that will likely unfold over the coming months as the models mature and more tooling becomes available.
Sources
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.