Nvidia CEO unveils new tech to keep global AI expansion going
Nvidia CEO Jensen Huang displays the Grace Blackwell NVL72 GPU during his keynote speech at Computex 2025 in Taipei, Taiwan, on May 19.
PHOTO: EPA-EFE
NEW YORK – Nvidia unveiled the latest raft of technologies aimed at sustaining the boom in demand for AI computing – and ensuring that its products stay at the centre of the action.
Chief executive Jensen Huang on May 19 kicked off Computex, Asia’s biggest electronics forum, in Taiwan, touting new products and cementing ties with a region vital to the tech supply chain. His company’s shares are riding a fresh rally following a dealmaking trip to the Middle East as part of a trade delegation led by President Donald Trump.
Now back in his native Taiwan, Mr Huang introduced updates to the ecosystem around Nvidia’s accelerator chips, which are key to developing and running AI software and services. The central goal is to broaden the reach of Nvidia products and remove barriers to AI adoption by more industries and countries.
“When new markets have to be created, they have to be created starting here, at the centre of the computer ecosystem,” Mr Huang said of the island.
He opened with an update on timing for Nvidia’s next-generation GB300 systems for artificial intelligence workloads, which he said are coming in the third quarter of this year. They’ll mark an upgrade on the current top-of-the-line Grace Blackwell AI systems, which are now being installed by major cloud service providers.
The chipmaker is also offering a new version of the complete computers it provides to data centre owners. NVLink Fusion products will give customers the option to either pair their own central processing units with Nvidia’s AI chips, or pair Nvidia’s CPUs with another provider’s AI accelerators.
To date, Nvidia has only offered such systems built with its own components. This opening-up of its designs – which include crucial connectivity components that ensure a high-speed link between processors and accelerators – gives Nvidia’s data centre customers more flexibility and allows a measure of competition while still keeping Nvidia technology at the centre.
Major customers such as Microsoft and Amazon.com are trying to design their own processors and accelerators, a trend that risks making Nvidia less essential to data centres.
MediaTek, Marvell Technology and Alchip Technologies will create custom AI chips that work with Nvidia processor-based gear, Mr Huang said. Qualcomm and Fujitsu plan to make custom processors that will work with Nvidia accelerators in the computers. BLOOMBERG

