- Broadcom is expected to dominate the custom AI chip market as hyperscalers expand their in-house AI chip projects.
- According to reports, Microsoft, Google, Amazon, and Meta are the main customers for Broadcom's custom AI server compute ASICs.
- Broadcom is reported to be on track for roughly 60% share of the AI server compute ASIC market by 2027, with ASIC shipments expected to triple.
- The company is a key supplier to Microsoft's Maia AI chip program, built on years of AI infrastructure work.
For investors following Nasdaq GS:AVGO, the focus is Broadcom's behind-the-scenes role in powering hyperscale data centers. The company is at the center of custom AI chip efforts by Microsoft, Google, Amazon, and Meta, and those relationships are directly tied to infrastructure spending. Broadcom's stock currently trades at US$333.24, and its strong 3- and 5-year returns have drawn widespread market attention.
Broadcom's share of the AI server compute ASIC market is expected to reach 60% by 2027, with ASIC shipments expected to triple, underscoring the scale of AI-related demand for its custom silicon. If hyperscalers continue to prioritize in-house AI chips for performance and cost control, Broadcom's role as a design and manufacturing partner may remain central to data center build-out plans and long-term hardware roadmaps.
For Broadcom, the prediction that it will be the leading custom AI chip partner for hyperscalers bears directly on how AI-centric data centers get built: custom application-specific integrated circuits (ASICs) let Microsoft, Alphabet, Amazon, and Meta tailor cost and power consumption to their own models rather than relying solely on general-purpose GPUs from the likes of Nvidia and AMD. If hyperscalers expand these in-house AI programs as reported, Broadcom's role in supplying both accelerators and Ethernet connectivity could deepen its presence across the entire AI rack, not just individual chips.
This expected leadership in custom AI chips is consistent with the existing narrative that frames Broadcom as a major provider of AI accelerators and networking for large-scale cloud and AI platforms, supported by its infrastructure software portfolio. News of Broadcom's work on Microsoft's Maia AI chips and Google's Tensor processors, along with the broader AI-related backlog, supports the idea that custom chip demand is a core driver of both the optimistic and the cautious AVGO narratives in circulation.