
Huawei has publicly denied reports that its Pangu Pro MoE open-source model is a "recycled product" based on work from Alibaba.


In an official statement from the Huawei Noah's Ark Lab, the company says the Pangu Pro MoE model was developed in-house and trained from scratch on Huawei's Ascend hardware platform. Huawei insists the model was not created through continued training on another provider's model.

These statements come in response to an analysis by "HonestAGI" that was deleted from GitHub but is still accessible via the Wayback Machine. The report, titled "Intrinsic Fingerprint of LLMs: Continue Training is NOT All You Need to Steal A Model!", highlighted strong similarities between Huawei's Pangu Pro MoE and Alibaba's Tongyi Qianwen Qwen-2.5 14B, especially in the distribution of attention parameters.

The now-deleted GitHub analysis, still viewable on the Wayback Machine, compared the internal structure of Huawei's Pangu Pro MoE and Alibaba's Qwen-2.5 14B. | Image: GitHub via Wayback Machine

The anonymous GitHub analysis argued that Huawei's model may not have been trained from scratch but instead "upcycled." Huawei rejects this, stressing that while some code uses "industry-standard open-source practices," all open-source components are properly licensed and marked.
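The core idea behind this kind of "intrinsic fingerprint" comparison can be sketched roughly as follows. This is an illustrative assumption about the general technique, not the HonestAGI report's actual code: it treats the per-layer spread of attention projection weights as a model fingerprint and correlates fingerprints across models, using synthetic weight matrices in place of real checkpoints.

```python
import numpy as np

def attention_fingerprint(layer_weights):
    """Per-layer standard deviation of attention projection weights,
    used here as a crude structural 'fingerprint' of a model."""
    return np.array([w.std() for w in layer_weights])

def fingerprint_similarity(fp_a, fp_b):
    """Pearson correlation between two fingerprints; values near 1.0
    indicate closely matching parameter distributions."""
    return float(np.corrcoef(fp_a, fp_b)[0, 1])

# Synthetic stand-ins for three 12-layer models (hypothetical data):
rng = np.random.default_rng(0)
model_a = [rng.normal(0, 0.02 * (1 + i / 10), (64, 64)) for i in range(12)]
model_b = [w + rng.normal(0, 1e-4, w.shape) for w in model_a]  # near-copy of A
model_c = [rng.normal(0, 0.05, (64, 64)) for _ in range(12)]   # unrelated model

fp_a = attention_fingerprint(model_a)
fp_b = attention_fingerprint(model_b)
fp_c = attention_fingerprint(model_c)

print(fingerprint_similarity(fp_a, fp_b))  # close to 1.0: suspiciously similar
print(fingerprint_similarity(fp_a, fp_c))  # much weaker correlation
```

A high correlation on its own does not prove copying, which is why Huawei's rebuttal focuses on provenance and licensing rather than disputing that such statistics can look similar.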


Accusations go beyond Huawei

The accusations come at a sensitive time for Huawei, which is working to show that its Pangu models - trained on Ascend chips - can compete with global AI leaders. This matters as China seeks alternatives to Nvidia amid increasing US export restrictions.

For China, Huawei's chips are considered a way around tightening US curbs on Nvidia GPUs. The impact of those restrictions is already being felt: the DeepSeek R2 model is reportedly delayed due to lack of access to Nvidia H20 chips.

Huawei is promoting its upcoming Ascend 910C as China's top Nvidia alternative. Chip analyst Dylan Patel says the performance gap between the 910C and the formerly US-approved Nvidia H20 is now less than a year, compared to two years with the previous 910B. China still lacks an answer to Nvidia's Blackwell chips, but a scaled-down version for China is reportedly in the works.

Summary
  • Huawei denies claims that its open-source Pangu Pro MoE AI model is a repurposed version of Alibaba's technology, stating the model was independently developed and trained using Huawei's own Ascend chips.
  • The controversy began after an anonymous, now-deleted GitHub analysis highlighted close similarities in the attention parameters between Huawei's Pangu Pro MoE and Alibaba's Qwen-2.5 14B, suggesting incomplete retraining; Huawei maintains that all open-source components are properly licensed and labeled.
  • The dispute carries political weight, as Huawei aims to demonstrate that its Ascend chips can power leading AI models, which is crucial for China amid increasing US restrictions on Nvidia technology and ongoing efforts toward technological self-sufficiency.
Matthias is the co-founder and publisher of THE DECODER, exploring how AI is fundamentally changing the relationship between humans and computers.