While nearly every venture capital firm is rushing headlong into what many have dubbed the “Year One of AI Applications,” betting aggressively on copilots, agents, and vertical software startups, one young fund has chosen to swim decisively against the current. Rather than compete for crowded deals at ever-rising valuations, it has concentrated all its firepower on a far more foundational question: what does the AI factory of the future actually look like?
That fund is 3C AGI Partners. Founded just two years ago, it has pursued a sharply contrarian strategy—one centered not on applications, but on the reinvention of the underlying compute infrastructure that makes artificial intelligence possible. In doing so, it has achieved something few Chinese venture capital firms have managed: securing stakes in two of Silicon Valley’s most closely watched chip and computing power companies, Cerebras and Groq, and becoming the only Chinese VC to appear on the shareholder lists of both.
The timing now looks almost uncanny. As this article goes to press, OpenAI has formally announced a partnership with Cerebras, committing to purchase 750 megawatts of computing capacity in a deal reportedly worth $10 billion. Press reports have also indicated that Cerebras, which plans to go public in 2026, is seeking to raise $1 billion at a valuation of around $22 billion. Meanwhile, news that Nvidia is set to acquire Groq for $20 billion by the end of 2025 has electrified the global technology community. Within just 24 months, these two investments alone have delivered returns of more than six times and more than ten times, respectively, for 3C AGI Partners.
“How did you manage to secure stakes in companies like these, at such valuations?” Even the fund’s own limited partners have asked the question in disbelief.

At the center of this story is Esther Wong Kangman, founder and managing partner of 3C AGI Partners—a figure whose career path reads less like a straight line and more like a double helix, spiraling between Wall Street finance and frontier artificial intelligence.
Wong’s professional life spans both oceans and industries. Armed with an EMBA from the University of Chicago Booth School of Business and dual bachelor’s degrees in political science and economics from Stony Brook University—along with a minor in physics—she spent two decades in senior roles at institutions including J.P. Morgan, CICC, Barclays, and BOCOM International. For years, she lived at the heart of global capital markets.
Then, in 2017, she did something few of her peers expected. Pregnant and well established in investment banking, Wong walked away from stability to join SenseTime, then a fast-growing but still uncertain AI startup. There, she built the company’s investment and financing team and helped lead its fundraising journey from Series B all the way to IPO, raising a total of $6 billion along the way.
When ChatGPT ignited a global reckoning with artificial intelligence in 2023, Wong pivoted once again. Drawing on her experience as both financier and operator, she founded 3C AGI Partners, convinced that the next great bottleneck in AI would not be algorithms or applications, but infrastructure—compute, energy, and systems.
By the end of 2025 and the beginning of 2026, that conviction had translated into a string of deals that placed 3C squarely at the center of the global computing power race.
Wong’s relationship with Cerebras predates the current AI boom by years. She first met its founder, Andrew Feldman, in 2018 at a friend’s gathering. At the time, Feldman spoke openly about his ambition to challenge Nvidia, arguing that GPUs were fundamentally ill-suited for training the next generation of foundation models.
“He was already talking about foundation models before most people even knew what that meant,” Wong recalled. “This was right at the dawn of Transformers.”
Cerebras’ approach was radical. Instead of dicing a wafer into hundreds of small chips, the company went in the opposite direction—fabricating an entire 12-inch silicon wafer as a single processor, with 44GB of SRAM integrated directly onto it. The result was a chip roughly 56 times larger than the biggest conventional GPU, designed to eliminate the data-transfer bottlenecks that plague distributed systems.
When Wong launched 3C AGI Partners at the end of 2023, one of her first calls was to Feldman. He told her Cerebras had just signed a major deal with G42 in the Middle East, supplying AI data center solutions in which Cerebras and Nvidia hardware each accounted for half of the deployment.
That moment crystallized her conviction. With a valuation negotiated on the basis of long-standing trust, Cerebras became one of 3C’s flagship investments.
Today, Cerebras is widely seen as one of the few genuine “Nvidia challengers,” with real commercial traction across training and inference. Its inference systems have demonstrated speeds many times those of large-scale GPU-based cloud offerings, while offering compelling total cost of ownership over time.
If Cerebras represents a reimagining of training infrastructure, Groq embodies a similar rethink of inference.
Founded by Jonathan Ross, the architect behind Google’s first-generation TPU, Groq was built around a “software-first” philosophy—designing hardware by working backward from the desired computational outcome. Long before large language models entered the mainstream, Ross was arguing that inference, not training, would ultimately define AI’s mass adoption.
For years, that vision proved difficult to monetize. Selling standalone chips required customers to overhaul their software stacks and absorb millions of dollars in trial costs—an unrealistic proposition when inference demand was still nascent.
Wong followed Groq closely but held off investing. “I believed in their philosophy,” she said, “but I didn’t see how it would work commercially at that time.”
That changed in 2024, when two things happened. First, Wong became convinced that AI was entering what she calls the “2.0 era”—moving out of the lab and into everyday use, making inference central. Second, Groq pivoted its business model from selling hardware to offering cloud-based inference tokens. The cost of experimentation dropped from millions of dollars to mere cents.
Within 18 months, GroqCloud attracted more than two million developers.
3C invested soon after. Less than a year later, Nvidia’s $20 billion acquisition of Groq validated the thesis at a global scale.
Nvidia’s decision to acquire Groq, while leaving Cerebras independent, underscores a deeper shift in the AI chip landscape.
“Nvidia already owns training,” Wong said. “That’s unshakeable. But if AI is going to become universal, inference is where the real battle will be fought.”
Groq fills a critical gap in Nvidia’s portfolio: ultra-fast, deterministic inference capable of supporting real-time applications such as robotics and translation. Its architecture sidesteps the memory bottleneck entirely, offering speed and stability that traditional GPU designs struggle to match.
The acquisition also strengthens Nvidia’s software moat and accelerates its transition from selling chips to delivering full-stack AI factories.
For Cerebras, paradoxically, the deal is good news. It sharpens the market’s understanding that training and inference are distinct problems—and that radically different architectures can command premium valuations in Silicon Valley.
At the heart of 3C’s strategy is a single organizing idea: redefining the AI factory.
To Wong, AI infrastructure is not synonymous with buying GPUs or building conventional data centers. It is about rethinking compute, energy, cooling, and deployment from first principles.
That philosophy underpins 3C’s three core investment themes.
The first is optimizing inference-centric compute ecosystems, spanning chips, optics, and networking.
The second is reimagining where and how AI factories exist. One of 3C’s boldest bets is Starcloud, a company incubated by the fund to build space-based data centers. Last year, Starcloud successfully launched Nvidia H100 GPUs into orbit. Its long-term ambition: a 5-gigawatt, solar-powered orbital data center.
The third theme is energy. As AI drives exponential growth in power demand, Wong believes traditional energy systems will fall short. Nuclear fusion, in her view, is the ultimate answer. Through a joint investment with Bill Gates’ energy fund, 3C has backed Type One Energy, which is already working with the Tennessee Valley Authority on a commercially viable fusion plant.
Wong is blunt about the current rush into AI applications.
“Most of them won’t matter in the long run,” she said.
She draws an analogy to the early internet era, when countless popular services rose and vanished before Google emerged. In her framework, AI is still building its rocket. Designing applications around today’s limited inference speeds, she argues, is like perfecting the bicycle while the rocket is still on the drawing board.
In her vision of AI 3.0, the primary users of AI will be other AI agents—systems that understand human preferences, orchestrate models autonomously, and operate continuously in real time. That world demands inference that is orders of magnitude faster and cheaper than today’s.
“When the rocket is ready,” she said, “the real applications will design themselves.”
How did a two-year-old fund from China secure a place on some of Silicon Valley’s most coveted cap tables?
Wong’s answer is simple, but demanding: deep understanding, long-term trust, and the courage to stay non-consensus.
3C’s team comes from the industry, not from trend-chasing. Many of its founder relationships were built years before those founders became famous. The fund offers more than capital—it offers practical help, technical judgment, and patience.
“Great projects are never short of money,” Wong said. “They are short of trust.”
That trust, she believes, can only be earned by those who have built something themselves—and survived failure.
As markets debate when the next killer AI application will emerge, 3C AGI Partners has already planted its flag deep in the infrastructure that will make such applications inevitable.
For Wong Kangman, redefining the AI factory is not a detour from innovation—it is the road itself.
“Native AI applications may belong to the era of AI 3.0,” she said. “But before that, someone has to build the road.”
Two years in, the harvest season for this young fund may only just be beginning.