AsianFin -- Chinese researchers have introduced a new brain-inspired large language model (LLM) that they say can deliver more energy-efficient reasoning than mainstream transformer-based systems such as OpenAI’s ChatGPT and Google’s BERT.
The model, called SpikingBrain, was developed by a team led by Li Guoqi and Xu Bo at the Institute of Automation under the Chinese Academy of Sciences (CAS), according to a recently published study. Training was carried out on hundreds of graphics processing units provided by Shanghai-based chipmaker MetaX.
Unlike the transformer architectures that power today’s dominant LLMs, SpikingBrain uses event-driven spiking neurons, which fire only when their accumulated input crosses a threshold, mimicking the adaptive, energy-efficient signaling of the human brain. The researchers say this design enables training on significantly smaller datasets while consuming fewer computational resources; a simplified sketch of how such a neuron works follows below.
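The core idea can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron, a standard textbook abstraction of spiking behavior. This is a generic sketch, not SpikingBrain’s actual architecture; the function name `lif_step` and all parameter values here are hypothetical, chosen only for readability.

```python
# Illustrative leaky integrate-and-fire (LIF) neuron -- a generic spiking-neuron
# model, NOT the SpikingBrain implementation. All names/values are hypothetical.

def lif_step(v, i_in, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Advance the membrane potential v by one timestep given input current i_in.

    Returns (new_v, spiked). The neuron emits a spike (an "event") only when v
    crosses v_thresh; otherwise it stays silent, which is where the energy
    savings of event-driven computation come from.
    """
    # Leaky integration: the potential decays toward rest while accumulating input.
    v = v + dt * (-v / tau + i_in)
    if v >= v_thresh:
        return v_reset, True   # fire, then reset
    return v, False            # remain silent; no downstream work is triggered

# Drive one neuron with a constant current and record its sparse spike train.
v, spike_times = 0.0, []
for t in range(100):
    v, fired = lif_step(v, i_in=0.08)
    if fired:
        spike_times.append(t)

print(f"spike times: {spike_times}")  # only a handful of events over 100 steps
```

With these parameters the neuron fires roughly once every 20 timesteps, so most steps produce no output at all; in hardware, silent neurons can simply be skipped, in contrast to dense transformer layers that compute for every token on every pass.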
“SpikingBrain represents a non-transformer path for AI development,” said Xu Bo, director of the institute, in comments to Xinhua News Agency. “It might inspire the design of next-generation neuromorphic chips with lower power consumption.”