
Huawei Open-Sources Pangu 7B Dense and 72B MoE Models

Jun 30, 2025, 4:09 a.m. ET

AsianFin — Huawei has officially open-sourced its 7-billion-parameter Pangu dense model and the 72-billion-parameter Pangu Pro MoE (Mixture-of-Experts) model, along with inference technology built on its Ascend AI infrastructure.

Key updates include:

  1. The model weights and core inference code for the Pangu Pro MoE 72B model are now available on open-source platforms.

  2. Huawei has also released large-scale MoE inference code optimized for its Ascend chip architecture.

  3. The weights and inference code for the Pangu 7B dense model will be made available shortly.

This move signals Huawei’s deepening commitment to open AI ecosystems, particularly in advancing large-scale model performance through domestic computing frameworks.
