According to CNBC, Loop Capital has maintained its buy rating on Nvidia while sharply raising its price target from $250 to $350 per share, a potential 73% upside from current levels. Analyst Ananda Baruah projects Nvidia will double its GPU shipments over the next 12 to 15 months, reaching 2.1 million units by the January quarter of 2026, with average selling prices expected to expand at the same time. Baruah described the current environment as the next “Golden Wave” of generative AI adoption, with Nvidia positioned at the forefront of stronger-than-anticipated demand, though he flagged risks including real estate and power constraints as well as potential legislation affecting AI revenue generation. He further suggested that $400 per share could ultimately be within Nvidia’s sights as investors digest the company’s 2027 outlook. Nvidia shares have already rallied 51% this year, and 60 of the 66 analysts covering the stock rate it a buy or strong buy.
The Coming Infrastructure Wall
While the shipment projections appear ambitious, they highlight a critical bottleneck that could reshape the entire AI industry. Doubling GPU shipments to 2.1 million units is not just a manufacturing challenge but a deployment one. Each high-end AI GPU consumes substantial power: Nvidia’s Blackwell-architecture GPUs reportedly draw 1000-1200 watts apiece, so 2.1 million additional units would draw roughly 2.1-2.5 gigawatts at the chip level, and on the order of 2.5-3 gigawatts of continuous power once cooling and power-delivery overhead are included, equivalent to powering about 2 million homes. This creates a fundamental constraint that sits outside Nvidia’s control and could force major cloud providers and AI companies to rethink their data center strategies, potentially slowing actual deployment even if manufacturing targets are met.
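As a rough sanity check on those figures, here is a minimal back-of-envelope sketch in Python. The per-GPU wattage range comes from the paragraph above; the facility overhead factor and the per-home consumption figure are illustrative assumptions, not sourced values.

```python
# Back-of-envelope estimate of the power required to run 2.1M additional AI GPUs.
# The overhead factor and per-home draw are assumptions for illustration only.

gpu_units = 2_100_000          # projected additional GPU shipments
watts_per_gpu = (1000, 1200)   # reported per-GPU draw for Blackwell-class parts
overhead = 1.2                 # assumed facility overhead (cooling, power delivery)
avg_home_kw = 1.25             # assumed average continuous draw of a US home (~11,000 kWh/yr)

for w in watts_per_gpu:
    chips_gw = gpu_units * w / 1e9        # GW consumed by the chips alone
    facility_gw = chips_gw * overhead     # GW including facility overhead
    homes = facility_gw * 1e9 / (avg_home_kw * 1000)
    print(f"{w} W/GPU: chips {chips_gw:.2f} GW, "
          f"with overhead {facility_gw:.2f} GW, ~{homes/1e6:.1f}M homes")
```

Under these assumptions, the 2.5-3 gigawatt range only materializes once facility overhead is counted, which is precisely the deployment constraint described above.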
Competitive Responses and Market Fragmentation
The projected growth assumes Nvidia maintains its current dominance, but competitors are mobilizing aggressively. AMD’s MI300 series and Intel’s Gaudi 3 represent increasingly viable alternatives, while cloud providers like Google, Amazon, and Microsoft are accelerating their custom silicon development. What’s particularly telling is that Baruah’s analysis focuses entirely on the Blackwell cycle ramp, suggesting Nvidia’s technological lead remains secure for now. However, the industry is witnessing early signs of fragmentation as customers seek to avoid single-vendor dependency, especially given Nvidia’s premium pricing. The projected ASP expansion could actually accelerate this trend by making alternatives more economically attractive for certain workloads.
The Regulatory Overhang
The mention of legislation impacting AI revenue generation points to a significant wildcard that most analysts underestimate. We’re seeing early regulatory movements in both the EU and US that could restrict certain AI applications or impose additional compliance costs. The EU AI Act already establishes risk categories that could limit deployment of certain generative AI models, while US export controls on advanced AI chips to China continue to evolve. These regulatory headwinds could materially impact the total addressable market, particularly as governments become more concerned about AI safety and strategic competition.
Broader Investment Implications
For investors, the Nvidia story has become a proxy for the entire AI infrastructure market. The projected growth suggests continued strength across the semiconductor ecosystem, from TSMC’s advanced packaging capabilities to memory suppliers like Micron and SK Hynix. However, the concentration risk is becoming apparent—if Nvidia stumbles, the entire AI infrastructure narrative could face challenges. The more interesting investment thesis may lie in the companies addressing the constraints Baruah identified, including power management specialists, data center real estate providers, and cooling technology innovators who stand to benefit regardless of which AI chips ultimately dominate.
