- QIA Announces Its Participation in a Series C Funding Round to Enable d-Matrix’s Generative AI Inference Technologies
- The Round Was Led by a Global Coalition of Major Investment Firms Across Europe, North America, Asia, and the Middle East, Including Bullhound Capital, Triatomic Capital, and Temasek
- QIA Joins as a New Investor Alongside EDBI
Qatar Investment Authority (QIA) has participated in a $275 million Series C funding round for d-Matrix, a company specializing in generative AI inference solutions for data centers. The round will support d-Matrix’s global expansion in advanced AI technologies and accelerate the deployment of its platforms across hyperscalers, enterprises, and governments.
Following the completion of this round, the total funding raised by d-Matrix reaches approximately $450 million.
The round was led by a global coalition of major investment firms from Europe, North America, Asia, and the Middle East, including Bullhound Capital, Triatomic Capital, and Temasek, along with new investors such as QIA and EDBI.
The round also included continued participation from previous investors, including M12 (Microsoft’s Venture Fund), Mirae Asset, Industry Ventures, and Nautilus Venture Partners.
d-Matrix’s fully integrated inference platform combines advanced compute-in-memory architecture, high-speed interconnects, and optimized inference software, delivering 10x faster performance, 3x lower cost, and 3–5x better energy efficiency compared to conventional GPU-based systems.
This enables data centers to handle significantly larger AI workloads more efficiently while delivering profitable AI services to customers, addressing the growing challenges of AI sustainability.
Sid Sheth, CEO and Co-Founder of d-Matrix, commented:
“This funding supports our vision as the industry enters a new era of AI inference.”
Jeff Huber, General Partner at Triatomic Capital, said:
“AI inference has become the dominant cost in AI production systems. d-Matrix has achieved scalable performance and sustainable economics. Their digital in-memory computing architecture is purpose-built for the low-latency, high-throughput inference workloads that matter most.”
He added:
“d-Matrix is redefining what is economically possible in AI infrastructure.”
Michael Stewart, Managing Partner at M12, Microsoft’s venture fund, said:
“The accelerating demand for AI inference means that efficiency and scalability are now critical to revenue generation and profitability for hyperscalers and AI factories.”
He added:
“d-Matrix is the first AI chip startup to address the current unit economics of LLM inference across the fastest-growing model sizes, thanks to its differentiated in-memory compute architecture which preserves total cost of ownership advantages while improving latency and throughput.”
