Jensen Huang: AI hardware performance has increased a millionfold in 10 years
NVIDIA CEO Jensen Huang met with UK Prime Minister Keir Starmer at the opening of London Tech Week, an event focused on the future of artificial intelligence and its impact on the economy. Huang said that the performance of AI systems has increased a millionfold over the past decade, underscoring the pace of technological progress. Although the head of NVIDIA did not specify whether he meant hardware or software advances, everything points to the growth in GPU and data-center compute.
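To put the millionfold figure in perspective, here is a quick back-of-the-envelope calculation (a sketch in Python; the numbers are purely illustrative of what such a claim implies, not anything Huang stated):

```python
# What annual growth rate does a millionfold gain over ten years imply?
# Illustrative arithmetic only.
total_gain = 1_000_000
years = 10

# Compound growth factor per year: total_gain ** (1 / years)
annual_factor = total_gain ** (1 / years)
print(f"Implied growth per year: ~{annual_factor:.1f}x")  # ~4.0x
```

In other words, a millionfold improvement over a decade corresponds to roughly a fourfold gain every year, compounded.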
An example of such a leap is the new generation of chips. The Blackwell B200 delivers up to 20 petaflops of FP4 inference compute, while the P100 GPU of 2016 delivered just 19 teraflops of FP16. Blackwell is also 42 times more energy efficient per processed token. Although these figures do not give an exact 1:1 comparison, they support the idea that the overall performance of modern AI clusters is indeed many times greater than that of ten years ago.
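As a rough check on the cited figures, the per-chip ratio can be worked out directly (a sketch in Python; note that the precisions differ, FP4-class inference on Blackwell versus FP16 on the P100, so this is an order-of-magnitude illustration rather than an apples-to-apples comparison):

```python
# Per-chip throughput ratio from the figures cited above.
b200_inference_flops = 20e15  # ~20 petaflops (Blackwell B200, FP4 inference)
p100_fp16_flops = 19e12       # ~19 teraflops (P100, FP16, 2016)

per_chip_ratio = b200_inference_flops / p100_fp16_flops
print(f"Per-chip throughput ratio: ~{per_chip_ratio:,.0f}x")  # ~1,053x
```

A roughly thousandfold per-chip gain, multiplied by the far larger scale of today's clusters, is how a claim of overall millionfold progress becomes plausible.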
At the same time, the UK still lags far behind the US in infrastructure. The country is building the Isambard-AI system with 5,500 Grace Hopper accelerators, while the American company xAI already runs 200,000 Hopper GPUs and plans to build a cluster of a million Blackwell GPUs. Nevertheless, NVIDIA will invest in local development: the company will open an AI research center in Britain, support projects in robotics, environmental modeling, and 6G, and launch an AI education program for developers. The focus will be on upgrading skills, collaborating with government agencies, and creating a secure environment for testing AI in the financial sector.