
Nvidia CEO Backs Data Center Overhaul, Generating AI Revolution

 

Speaking at the Communacopia + Technology conference in San Francisco, Nvidia chief executive Jensen Huang criticized traditional “gigantic” data centers as highly inefficient.

“These giant data centers are super inefficient because [they are] full of air, and air is a lousy conductor of electricity,” Huang said.

He advocated for compact, energy-efficient data centers that use advanced technologies such as liquid cooling, which are challenging to implement in sprawling facilities.

“What we want to do is take that few, you know, call it a 50-, 100-, 200-megawatt data center that’s stretched out and condense it into a really, really small data center,” he said, stressing that modernization efforts could take the next decade.

Despite the high initial cost of Nvidia’s server racks, which can run “several million per rack,” Huang said their performance justifies the investment.

“The amazing thing is that just the cables to connect old general purpose computing systems cost more than replacing all of those and compacting them into one rack,” he said.

Generative AI and accelerated computation

Huang also noted the potential of generative AI to redefine industries, describing it as more than a tool.

“This generative AI is not just a tool, it’s a skill,” he said, predicting that AI will expand beyond traditional data centers and IT into the creation of digital skills, such as virtual drivers.

“For the first time, we will create skills that enrich people. That’s why people think AI will expand beyond the trillion dollars of data centers and IT and into the world of skills. So what is a skill? Digital driver is a skill,” he said.

Nvidia’s focus on accelerated computing promises immediate return on investment (ROI), Huang said, by improving virtual data center utilization.

Virtualization allows workloads to move seamlessly across the data center rather than being tied to specific machines, reducing costs. Cloud hosting further optimizes resources by allowing multiple companies to share infrastructure.
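As a rough illustration of why that mobility matters, the sketch below (a hypothetical Python example, not anything from Nvidia) compares how many machines are needed when each workload is pinned to its own server versus when a virtualized pool lets workloads share capacity.

# Hypothetical workloads, each expressed as a fraction of one machine's capacity.
workloads = [0.2, 0.5, 0.3, 0.1, 0.6, 0.4, 0.25, 0.15]

# Without virtualization: each workload is tied to a dedicated machine.
dedicated_machines = len(workloads)

# With virtualization: any machine can host any workload, so a simple
# first-fit packing shows how few machines the shared pool actually needs.
pool = []  # remaining capacity of each machine in the shared pool
for load in sorted(workloads, reverse=True):
    for i, free in enumerate(pool):
        if free >= load:
            pool[i] = free - load
            break
    else:
        pool.append(1.0 - load)

print(f"Dedicated machines needed: {dedicated_machines}")   # 8
print(f"Shared (virtualized) pool: {len(pool)} machines")   # 3

In this toy case the same workloads fit on three shared machines instead of eight dedicated ones, which is the utilization gain the virtualization argument rests on.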

“The returns on this are fantastic because the demand is so high,” Huang said. “Every dollar they spend with us turns into $5 worth of rent… So the demand for that is just incredible.”
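Taken at face value, that ratio makes the back-of-the-envelope math straightforward; the figures below are purely hypothetical and simply show how the quoted multiple scales.

# Purely illustrative numbers: what a 5x rent-to-spend ratio would imply for a cloud provider.
rack_cost = 3_000_000   # hypothetical spend on one rack, in dollars
rent_multiple = 5       # Huang's quoted "every dollar... turns into $5 worth of rent"
implied_rent = rack_cost * rent_multiple
print(f"${rack_cost:,} of spend implies roughly ${implied_rent:,} of rental revenue")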

Huang emphasized that the future of computing is in systems, not individual chips.

“The way computers are built today … we designed seven different types of chips to create the system. Blackwell is one of them.

“So the amazing thing is that when you want to build this artificial intelligence computer, people say words like ‘supercluster’, ‘infrastructure’, ‘supercomputer’ for good reason, because it’s not a chip, it’s not a computer per se. So we’re building entire data centers,” he said.

Vision of digital engineers

Huang also predicted a future in which software engineers collaborate with AI-powered “digital companion engineers,” effectively multiplying their productivity.

The days of software engineers writing every line of code are “totally over,” he said.

In Nvidia’s vision, 32,000 employees will eventually be supported by 100 times as many digital engineers.

“So the way I look at Nvidia, we have 32,000 employees, but those 32,000 employees are surrounded by hopefully 100 times as many digital engineers,” Huang said.

Nvidia is due to report its third-quarter results this week, prompting analysts to question whether the company will set the stage for a strong end to the year for the tech giants, or whether the market will hold on to its impressive year-to-date gains in the final weeks of 2024.

 
