AI Gold Rush Also Boosts Networking
Joseph Byrne
One of the biggest beneficiaries of California’s 1849 gold rush wasn’t a miner, but Levi Strauss, who famously sold the miners clothing. Likewise, Marvell is making money from the AI boom by selling the ancillary products the boom requires. At the same time, Nvidia remains the prime AI-accelerator supplier, growing its business even as the industry has sought alternatives.
In its recent earnings call for its first fiscal quarter, Marvell highlighted AI’s role in propelling its data-center business. Seeing the value in associating with AI, the company reviewed its prior fiscal-year revenue and concluded that $200 million came from AI. Most of that total came from its optical DSPs, the physical-layer devices required to build high-speed networks (MPR Mar 2023, “Marvell Switches On Teralynx 10”). The company expects its AI-attributable revenue to double in the current fiscal year and to double again in the next. Growth will come from transceivers, switches, security ICs, and custom/semicustom designs; the last of these includes a couple of projects expected to ramp up next year.
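A quick back-of-the-envelope calculation makes that trajectory concrete. The minimal sketch below simply compounds the company’s stated doubling from the $200 million base; the fiscal-year labels are illustrative assumptions, not company guidance.

```python
# Back-of-the-envelope projection of Marvell's AI-attributable revenue,
# using only the figures cited above: a $200M base that the company
# expects to double in the current fiscal year and double again next year.
# Fiscal-year labels are illustrative assumptions.

base_revenue_m = 200     # prior fiscal year, in millions of dollars
growth_factor = 2.0      # "double ... and double again"

revenue = base_revenue_m
for fiscal_year in ("current fiscal year", "next fiscal year"):
    revenue *= growth_factor
    print(f"{fiscal_year}: ~${revenue:,.0f}M in AI-attributable revenue")

# Output:
# current fiscal year: ~$400M in AI-attributable revenue
# next fiscal year: ~$800M in AI-attributable revenue
```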
Demand for AI has a more direct effect on Nvidia’s business. In its recent earnings call, Nvidia reported that data-center sales topped $4 billion, up 14% over the prior year and exceeding Intel’s 1Q23 Data Center and AI (DCAI) revenue. Nvidia’s highlights for the quarter include deployments of its massive H100 (Hopper) AI accelerators at hyperscalers including Microsoft Azure, Google Cloud, Oracle Cloud, AWS, and Meta. At the same time, generative AI and large language models (LLMs) are gaining traction within enterprises alongside other AI applications. Nvidia expects whole-company revenue this quarter to reach $11 billion, up from $7 billion last quarter, boosted in large part by generative AI and LLMs.
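For context, that guidance implies sequential growth of roughly 57%. The short sketch below works the figure out from the two revenue numbers quoted above; both are rounded, so the result is only approximate.

```python
# Implied quarter-over-quarter growth from Nvidia's guidance cited above:
# ~$7B reported last quarter versus ~$11B guided for this quarter.

last_quarter_b = 7.0      # reported revenue, billions of dollars
guided_quarter_b = 11.0   # guided revenue, billions of dollars

qoq_growth = (guided_quarter_b - last_quarter_b) / last_quarter_b
print(f"Implied sequential growth: {qoq_growth:.0%}")  # ~57%
```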
Having targeted data-center operators, Marvell found itself in a fortuitous position as AI fueled another wave of capital investment. AI clusters demand the fastest networking technology, and Marvell is among the few companies that can supply it. Also fortuitously, the company assembled an ASIC business mainly to serve telecom customers but is now positioned to work with hyperscalers demanding customized solutions.
Conspicuously absent from Marvell’s portfolio are standard chips for AI training and inference, even as companies large and small have sought to dislodge Nvidia. None of those attempts has succeeded despite sizable investment. Meanwhile, Nvidia’s business is surging, enabling it to invest more in developing the hardware and software that will keep it the market leader. Marvell is demonstrating that a good strategy isn’t defined only by the opportunities a company chooses to pursue but also by the ones it chooses not to.