By Adam O'Dell, Banyan Hill Publishing, 2024-05-10
It was the kind of contraption only a mad scientist could love…
A rack of massive cylinders connected to a dense cluster of hoses and tubes.
The machine towered over Mythbusters hosts Adam Savage and Jamie Hyneman.
As Savage explained: “When we hit THIS trigger on this thing, 2,100 gallons of compressed air goes through these valves, out these accumulators and into all 1,100 of these tubes…”
Well, you should see it for yourself:
In just 80 milliseconds, the Mythbusters' machine painted the Mona Lisa in front of a live audience.
The demonstration was organized by Nvidia Corp. as a way to show the difference between its graphics processing units (GPUs) and the central processing units (CPUs) that power most computers.
CPUs tend to rely on a small number of powerful cores, working through tasks largely one step at a time.
The Mythbusters represented this with a single automated paint cannon, slowly sketching out a smiley face one “pixel” at a time:
Meanwhile, Nvidia's GPUs have thousands of core processors called “CUDA cores.” These cores aren't as powerful on their own — but Nvidia's latest cards have over 16,000 of them!
So instead of rendering one “pixel” at a time, a GPU can render an entire frame in a single blast.
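To make that difference concrete, here is a minimal CUDA sketch of the same idea. Everything in it is illustrative rather than anything Nvidia or the Mythbusters actually used: the hypothetical paint_cpu function visits pixels one at a time in a loop (the single paint cannon), while the paint_gpu kernel is launched with one thread per pixel, so thousands of CUDA cores fill the image in a single pass (the "one blast").

```cuda
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

#define WIDTH  1024
#define HEIGHT 1024

// CPU approach: one loop visits every pixel in sequence,
// like the single automated paint cannon sketching the smiley face.
void paint_cpu(unsigned char *img) {
    for (int y = 0; y < HEIGHT; ++y)
        for (int x = 0; x < WIDTH; ++x)
            img[y * WIDTH + x] = (unsigned char)((x + y) % 256);  // placeholder "paint" value
}

// GPU approach: the kernel is launched with one thread per pixel,
// so thousands of CUDA cores paint their pixels at the same time.
__global__ void paint_gpu(unsigned char *img) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < WIDTH && y < HEIGHT)
        img[y * WIDTH + x] = (unsigned char)((x + y) % 256);
}

int main() {
    // Serial version on the host.
    unsigned char *h_img = (unsigned char *)malloc(WIDTH * HEIGHT);
    paint_cpu(h_img);

    // Parallel version on the device: a 2D grid of 16x16-thread blocks
    // covering the whole image, one thread per pixel.
    unsigned char *d_img;
    cudaMalloc((void **)&d_img, WIDTH * HEIGHT);
    dim3 block(16, 16);
    dim3 grid((WIDTH + 15) / 16, (HEIGHT + 15) / 16);
    paint_gpu<<<grid, block>>>(d_img);
    cudaDeviceSynchronize();

    printf("Painted %d pixels in one parallel launch\n", WIDTH * HEIGHT);
    cudaFree(d_img);
    free(h_img);
    return 0;
}
```

The kernel body is deliberately trivial; the point is the launch configuration, which maps the image onto a grid of threads so the whole frame is painted at once instead of pixel by pixel.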
Nvidia's Stranglehold on AI Hardware
When it comes to the actual hardware that will enable the next phase of the AI revolution, Nvidia has a truly massive advantage.
An advantage that's worth far more than the treasure trove of research capital owned by its competitors.
Because, in the words of one former Nvidia developer: “Money is just a transient commodity that has not been converted into GPUs yet.”
This isn't just a happy accident for Nvidia, either.
For over a decade now, the company has been working continuously toward AI dominance … optimizing each new generation of chips for deep learning and generative AI execution.
Nvidia has also developed specialized software that pushes GPU performance further still, delivering up to 8X higher performance and a 5X reduction in energy consumption.
That's why the company's GPUs have become “the dominant platform for accelerating machine learning workloads,” according to a study from independent research group Epoch.
ChatGPT — the world's most popular generative AI service, built on large language models (LLMs) — runs on Nvidia GPUs and serves more than 100 million people.
And virtually every other major AI project relies on GPUs as well.
As one Red Hat blogger perfectly summarized: “GPUs have become the foundation of artificial intelligence.”
AI Developers Face New “Invisible” Challenge
I've recently discovered a critical bottleneck in the development of next-generation AI programs.
In order for growth to continue, Big Tech and AI creators will be forced to resolve this issue … delivering up to 10X gains for those in a position to solve it.
This is a very serious situation … yet it's still completely “off the radar” for most companies and investors.
So I've decided to host a special AI Power Summit to show you how this short-term crisis is actually a long-term opportunity…
One that will give those who missed out on AI's biggest gains over the last year a second chance to target 10X gains in the next 12 months.
Most importantly, I'll share why it's so urgent that you take advantage now … before an announcement scheduled just days from now.
To be clear, this is a must-attend event for anyone looking to grab hold of the next round of AI profits.
This special video presentation just went live, so take a moment to watch immediately by going HERE.
To good profits,
Adam O'Dell
Chief Investment Strategist, Money & Markets