Alibaba Advances AI Ambitions with New Domestic AI Chip Amid Global Tech Tensions

In a major step toward technological self-reliance, Chinese tech giant Alibaba has unveiled a new AI accelerator chip designed for cloud computing and artificial intelligence inference tasks. This latest chip initiative aligns with China’s broader national strategy to reduce reliance on U.S. semiconductor technology, notably Nvidia’s high-end AI chips, amid ongoing geopolitical and trade tensions.
Alibaba’s cloud computing division, Alibaba Cloud, has been quietly advancing its AI hardware capabilities for several years. Earlier designs, such as the 2019 Hanguang 800, targeted conventional machine learning models but lacked the versatility required for today’s large language models and the diffusion models widely used in generative AI. The newly developed chip breaks from that narrow focus: it supports a broader range of AI inference workloads, inference being the phase in which a trained model is deployed to generate predictions or outputs from new data.
Importantly, the new processor is optimized for AI inference rather than training. Model training remains the most computationally demanding phase and typically requires the most powerful and expensive GPUs, such as Nvidia’s top-tier H100 or the export-compliant H20 sold in China, whereas inference requires fewer resources and is what allows trained models to be served at scale in practical deployments. By focusing its domestically developed silicon on inference, Alibaba gains a practical entry point into the chip market without immediately tackling the hardest problems of training at scale.
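To make the training-versus-inference distinction concrete, here is a minimal PyTorch sketch of the inference phase using a toy stand-in model (not anything tied to Alibaba’s hardware). The key point is that inference is a forward pass only, with no gradient tracking, which is why it needs far less memory and compute than training.

```python
import torch
import torch.nn as nn

# A toy stand-in model; in production this would be a trained LLM or diffusion model.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()  # switch layers like dropout/batchnorm to inference behavior

batch = torch.randn(32, 128)  # a batch of incoming requests

with torch.no_grad():  # skip gradient bookkeeping entirely
    logits = model(batch)
    predictions = logits.argmax(dim=-1)

print(predictions.shape)  # torch.Size([32])
```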
The chip also reportedly carries more onboard memory than Nvidia’s current H20, letting it handle larger models or batch sizes, though at the cost of higher power consumption. Its design emphasizes compatibility with common AI software frameworks such as PyTorch and TensorFlow, so engineers can reuse existing AI codebases with relatively little modification. That compatibility should ease integration hurdles and speed the rollout of Alibaba’s AI infrastructure.
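The following is a hypothetical sketch of what that framework compatibility buys engineers in practice: if a vendor ships a PyTorch backend plugin, existing inference code typically changes by little more than a device string. The device name "alibaba_npu" is purely illustrative; Alibaba has not published a backend identifier.

```python
import torch
import torch.nn as nn

def run_inference(model: nn.Module, inputs: torch.Tensor, device_name: str) -> torch.Tensor:
    # Move the model and data to the requested device, run a forward pass, return on CPU.
    device = torch.device(device_name)
    model = model.to(device).eval()
    with torch.no_grad():
        return model(inputs.to(device)).cpu()

model = nn.Linear(64, 8)
x = torch.randn(4, 64)

# Today: run on CPU (or "cuda" for an Nvidia GPU).
out = run_inference(model, x, "cpu")

# Hypothetically, with the vendor's PyTorch plugin installed and registered,
# the same call would target the new accelerator:
# out = run_inference(model, x, "alibaba_npu")
```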
Notably, the new AI chip is manufactured domestically within China due to U.S. export restrictions on advanced semiconductor fabrication and certain AI chip technologies. These restrictions have limited Chinese companies’ access to cutting-edge fabrication foundries like Taiwan Semiconductor Manufacturing Company (TSMC) and Samsung Electronics. Industry insiders suggest that China’s Semiconductor Manufacturing International Corporation (SMIC) is the most likely fabricator, reflecting the country’s concerted push to build up its domestic semiconductor ecosystem.
This development surfaces at a time when the Chinese government is actively urging big tech firms to move away from Nvidia’s AI accelerators amid concerns about supply chain risks and security. Earlier in 2025, Beijing called on companies to avoid Nvidia’s H20 GPUs even after the U.S. temporarily eased export controls on them. This geopolitical backdrop has pushed local AI leaders like Alibaba to accelerate investment in homegrown alternatives to sustain the country’s AI ambitions.
Beyond Alibaba, other Chinese startups and corporations are pursuing similar paths to establish competitive AI hardware solutions. For example, Shanghai-based MetaX has released an AI chip integrating older tech with new memory configurations to compete with Nvidia’s offerings. Huawei continues to push its Ascend series of neural processing units (NPUs), some of which analysts claim match or even surpass Nvidia chips in certain benchmarks, albeit often with significantly higher energy costs.
Alibaba’s AI efforts extend beyond hardware. The company has made strides in AI software and model development, launching the Qwen3 series of open AI models earlier in 2025. Coupled with its emerging chip architecture, Alibaba aims to create an integrated AI ecosystem capable of competing with global leaders like Nvidia, Google, and OpenAI.
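For readers who want to try the software side of that ecosystem, here is a brief sketch of running one of the open Qwen models for inference via Hugging Face transformers. The model ID shown is an assumption for illustration; check the Qwen organization on the Hub for the released Qwen3 variants and sizes.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-0.6B"  # assumed ID; pick the variant that fits your hardware
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
model.eval()

prompt = "Explain the difference between AI training and inference in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```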
Financially, Alibaba’s cloud computing division is a powerhouse, with reported revenue growth of 26% in the latest quarter, driven by increased adoption of cloud intelligence services. The company has publicly committed to investing billions into expanding both cloud and AI infrastructure over the coming years, emphasizing AI as a critical growth pillar.
Despite these advances, challenges remain. Domestic chip fabrication still lags behind international leaders in terms of maturity and efficiency. Memory bandwidth and power consumption constraints could limit the new chip’s competitiveness in high-end applications. Additionally, AI model training — a resource-intensive task where Nvidia currently dominates — is likely to remain dependent on foreign technology in the near term.
Alibaba’s new AI chip signals China’s accelerating push to establish strategic independence in the AI hardware space. By focusing on inference chips compatible with widespread AI frameworks and produced domestically, Alibaba combines technological pragmatism with national policy imperatives. As trade tensions continue and U.S. export controls persist, such homegrown innovations are poised to play a key role in shaping the future of AI in China and the global tech landscape.