The War Between ChatGPT & Bing Will Boost Nvidia

It’s often said that when two parties go to war, a third one benefits. The same goes for OpenAI’s ChatGPT and Microsoft’s new Bing Chat, whose rivalry benefits another company: Nvidia.

The hype around ChatGPT may continue to grow for some time, given that Microsoft now employs the same technology in its Bing app. To keep scaling this technology, both firms need ever-larger GPT models, and the hardware to train them.

Commercializing ChatGPT may require as many as 10,000 graphics processing units (GPUs), and Nvidia appears to be the most likely supplier.

The number of training parameters in OpenAI’s GPT language models grew from roughly 120 million in 2018 to about 175 billion (GPT-3) in 2020.

No comparable figure has been published for 2023 yet, but it’s safe to assume these numbers will only increase, to the extent that budget and technology permit.

According to Nvidia, processing the GPT model’s training data in 2020 required about 20,000 graphics cards, and that number is expected to exceed 30,000 as the models continue to grow.

That would be excellent news for Nvidia, as a recent study also claims that its graphics card sales are about to reach new heights.

Nvidia is not the only player in the GPU market; Intel and AMD are also major sellers. These calculations therefore rest on the assumption that OpenAI will train its language model on Nvidia’s A100 GPUs, which cost $10,000 to $15,000 apiece.

Alternatively, OpenAI may opt for the brand-new H100 cards, each of which costs around $30,000 or more and delivers roughly three times the performance of an A100.
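To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python that multiplies the GPU count quoted above by the quoted per-card prices. The 10,000-card count and the price ranges come from this article; applying the same count to both card types and the H100 upper bound are illustrative assumptions, not published estimates.

# Rough hardware-cost estimate from the figures quoted in this article.
# Assumptions: 10,000 cards for either model; A100 at $10k-$15k per card;
# H100 at "around $30,000 or more" (upper bound of +20% is a guess).

GPU_COUNT = 10_000  # GPUs estimated to commercialize ChatGPT

PRICE_RANGES = {
    "A100": (10_000, 15_000),
    "H100": (30_000, 36_000),
}

def hardware_cost(count: int, price_range: tuple[int, int]) -> tuple[int, int]:
    """Return the (low, high) total hardware cost in USD for `count` cards."""
    low, high = price_range
    return count * low, count * high

for name, prices in PRICE_RANGES.items():
    low, high = hardware_cost(GPU_COUNT, prices)
    print(f"{name}: ${low / 1e6:.0f}M - ${high / 1e6:.0f}M for {GPU_COUNT:,} cards")

# Output:
# A100: $100M - $150M for 10,000 cards
# H100: $300M - $360M for 10,000 cards

Under these assumptions, the GPU bill alone runs into the hundreds of millions of dollars, which is why a choice between A100 and H100 cards matters so much, and why Nvidia stands to gain either way.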

Either way, the expectation is that Nvidia will scale up its profits, because it is frequently seen as the go-to supplier for AI-related workloads.
