Nvidia’s new monster CPU+GPU chip may power the next gen of AI chatbots

NVIDIA’s GH200 “Grace Hopper” AI superchip. (credit: Nvidia)

Early last week at COMPUTEX, Nvidia announced that its new GH200 Grace Hopper “Superchip”—a combination CPU and GPU specifically created for large-scale AI applications—has entered full production. It’s a beast. It has 528 GPU tensor cores, supports up to 480GB of CPU RAM and 96GB of GPU RAM, and boasts a GPU memory bandwidth of up to 4TB per second.
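For a rough sense of what that bandwidth figure means, here is a back-of-envelope calculation using the advertised peak numbers from the announcement (an illustration only, not a measured benchmark):

    # Back-of-envelope only: how long a single pass over the GH200's GPU memory
    # would take at the advertised peak bandwidth.
    gpu_memory_gb = 96          # GPU RAM, per Nvidia's announcement
    bandwidth_gb_per_s = 4000   # "up to 4TB per second"

    seconds_per_pass = gpu_memory_gb / bandwidth_gb_per_s
    print(f"{seconds_per_pass * 1000:.0f} ms per full sweep of GPU memory")
    # Prints "24 ms", or roughly 40 full sweeps per second at peak.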

We’ve previously covered the Nvidia H100 Hopper chip, which is currently Nvidia’s most powerful data center GPU. It powers AI models like OpenAI’s ChatGPT, and it marked a significant upgrade over 2020’s A100 chip, which powered the first round of training runs for many of the news-making generative AI chatbots and image generators we’re talking about today.

Faster GPUs roughly translate into more powerful generative AI models because they can run more matrix multiplications in parallel (and run them faster), which is what today’s artificial neural networks need to function.
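To make that concrete, here is a minimal sketch in plain Python with NumPy (illustrative only, not Nvidia’s or any chatbot’s actual code): a single fully connected neural network layer boils down to one large matrix multiply, and large language models stack many such layers and run them for every token they generate.

    # Minimal, illustrative sketch: the core of a neural network layer is a
    # matrix multiplication, the operation GPU tensor cores parallelize.
    import numpy as np

    def dense_layer(x, weights, bias):
        # One fully connected layer: matrix multiply, add bias, apply ReLU.
        return np.maximum(x @ weights + bias, 0.0)

    # Toy sizes; real large language models use layers thousands of units wide
    # and stack dozens of them, so each response involves billions of multiplies.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((1, 1024))      # one input vector
    w = rng.standard_normal((1024, 4096))   # layer weights
    b = np.zeros(4096)                      # layer bias

    print(dense_layer(x, w, b).shape)       # (1, 4096)

A chip like the GH200 speeds this up by spreading those multiplications across hundreds of tensor cores and feeding them from very fast memory, rather than computing them a few values at a time.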

It takes a lot of computing power to pretend to be human.
