Penguin Solutions (PENG) Q2 2026: Memory Segment Surges 63%, Anchoring AI Infrastructure Pivot

Penguin Solutions’ Q2 revealed a decisive pivot toward AI infrastructure, with the integrated memory segment delivering standout growth and driving a raised full-year outlook. New CEO Kash Shaikh is accelerating investments in AI factory platforms, leveraging the company’s unique memory and compute integration to target inference workloads. While advanced computing faces near-term headwinds from hyperscale and Penguin Edge wind-downs, the business is rapidly diversifying its customer base and deepening its alignment with secular AI trends.

Summary

  • Memory-Driven Upside: AI-fueled memory demand and favorable pricing propel segment growth and lift full-year guidance.
  • AI Factory Focus: Strategic investments and new product launches position Penguin at the intersection of compute and memory for inference workloads.
  • Customer Diversification: Rapid shift away from hyperscale dependency as enterprise, neocloud, and sovereign wins build long-term resilience.

Performance Analysis

Penguin Solutions’ Q2 results underscore a strategic reweighting of the business model toward AI infrastructure and memory integration. The integrated memory segment, now half of total net sales, grew 63% year over year, benefiting from robust AI-driven demand and favorable pricing across telco, networking, and computing. This strength more than offset continued declines in advanced computing, which fell sharply due to the deliberate wind-down of hyperscale and Penguin Edge exposure.

Gross margin improved modestly year over year and sequentially, reflecting product mix and pricing tailwinds in memory, as well as tariff recovery in LED. However, management signaled that gross margins will compress in the back half as lower-margin AI hardware and rising memory costs dilute the mix. Operating expenses remained tightly managed, with increased R&D investment in clusterware software and memory AI solutions offset by cost discipline elsewhere. Cash flow from operations was healthy, though working capital rose to support higher inventory for anticipated second-half demand.

  • Segment Mix Shift: Memory now accounts for 50% of revenue, well ahead of advanced computing’s 34% share as the business pivots away from legacy hardware.
  • Bookings Momentum: Non-hyperscale AI HPC bookings up significantly, with five new customer logos in Q2 and seven in the first half, compared to three last year.
  • Capital Allocation: $32 million deployed for share repurchases, with a net cash position and no major debt maturities until 2029.

The quarter’s results reflect both the cyclical strength of memory and the early returns from Penguin’s AI-centric strategy, but also highlight the transitional volatility in advanced computing as legacy revenue streams are phased out.

Executive Commentary

"AI is moving from experimentation to production, with workloads increasingly shifting towards real-time inference. We are already seeing this translate into customer demand beyond hyperscale across enterprise, neocloud, and sovereign AI markets. We expect this transition to expand our addressable market and drive increased demand for integrated AI infrastructure, where Penguin is already winning."

Kash Shaikh, Chief Executive Officer

"In the quarter, Total Penguin Solutions net sales were $343 million, down 6% year over year. Non-GAAP gross margin came in at 31.2%, which was up 0.4 percentage points versus Q2 last year... Our full year net sales outlook reflects the following full year growth ranges by segment. For advanced computing, we now expect full year net sales to change between minus 25 and minus 15% year over year. For memory, we now expect net sales to grow between 65% and 75% year over year, driven by strong demand and a favorable pricing environment."

Nate Olmstead, Chief Financial Officer

Strategic Positioning

1. Memory as a Durable AI Lever

Penguin’s integrated memory business is now the company’s primary growth engine, with AI inference workloads fueling both volume and pricing upside. The launch of the Memory AI server—leveraging Compute Express Link (CXL), a high-speed interconnect enabling shared memory across CPUs and GPUs—demonstrates Penguin’s move up the value chain from commodity modules to differentiated, solution-oriented platforms. Management stressed that AI is creating a “more durable layer of demand” for memory, counterbalancing typical cyclical swings.

2. AI Factory Platform Expansion

The company’s AI factory platform strategy integrates cluster management software, advanced computing systems, scalable memory, reference architectures, and end-to-end services, targeting the full-stack needs of production-grade AI deployments. Recent product launches, such as the CXL-based KV cache server for large language model (LLM) inference, are tailored for low-latency, high-memory environments—an increasingly critical requirement as agentic AI adoption grows.
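The KV-cache idea behind such a server can be sketched in a few lines (illustrative Python using NumPy; this is a generic attention cache, not Penguin’s implementation): by storing each decode step’s key/value tensors, a new token attends over cached history instead of recomputing the full sequence, which is why inference servers benefit from large, low-latency memory pools.

```python
# Minimal, illustrative KV cache for autoregressive attention.
# Not Penguin product code; class and variable names are hypothetical.
import numpy as np

class KVCache:
    def __init__(self, d_model: int):
        # Cached keys/values grow by one row per generated token.
        self.keys = np.empty((0, d_model))
        self.values = np.empty((0, d_model))

    def append(self, k: np.ndarray, v: np.ndarray) -> None:
        # Store this step's key/value so later tokens can attend to it
        # without reprocessing earlier tokens.
        self.keys = np.vstack([self.keys, k])
        self.values = np.vstack([self.values, v])

    def attend(self, q: np.ndarray) -> np.ndarray:
        # Scaled dot-product attention of one query over all cached steps.
        scores = self.keys @ q / np.sqrt(self.keys.shape[1])
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        return weights @ self.values

cache = KVCache(d_model=4)
rng = np.random.default_rng(0)
for _ in range(3):                      # simulate three decode steps
    cache.append(rng.normal(size=4), rng.normal(size=4))
out = cache.attend(rng.normal(size=4))  # attends over all 3 cached rows
```

The cache trades memory for compute: footprint grows linearly with context length, which is what makes memory-rich, low-latency infrastructure a gating resource for long-context inference.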

3. Customer Base Diversification

Penguin is actively de-risking its business by reducing hyperscale concentration and expanding into enterprise, neocloud, and sovereign AI segments. The advanced computing segment’s non-hyperscale AI HPC revenue grew 50% in the first half, now representing over 40% of the segment’s mix (up from 20% last year). Five new AI HPC logos in Q2 highlight traction across financial services, biomedical research, and energy, underpinning a more resilient and repeatable revenue base.

4. Early Bets on Next-Gen Memory

Penguin’s early investment in photonic memory appliances (PMA) and partnership with Celestial AI (now Marvell) position the company to address future memory scaling bottlenecks in next-generation AI systems. Management views photonic interconnects as a key enabler for future-proofing AI infrastructure and expects to see incremental revenue from these solutions as inference workloads proliferate.

5. Operational Discipline and Capital Flexibility

Disciplined working capital management and a net cash balance equip Penguin to navigate supply chain constraints and opportunistically secure memory inventory. The company’s share buyback activity and absence of near-term debt maturities provide additional financial flexibility to support innovation and strategic investment.

Key Considerations

Penguin’s Q2 marks a clear inflection in business model focus and capital allocation, with execution in memory and AI infrastructure now central to value creation. The following considerations frame the strategic context for investors:

  • AI Inference Adoption Accelerates: Customer demand is shifting from model training to inference, intensifying requirements for low-latency, memory-rich infrastructure.
  • Hyperscale and Legacy Wind-Down: Advanced computing faces near-term revenue headwinds from the exit of hyperscale and Penguin Edge, but underlying bookings and pipeline quality are improving.
  • Supply Chain as a Gating Factor: Memory supply constraints are now the primary limit on upside, with management using the balance sheet to pre-buy inventory where possible.
  • Margin Compression Risk: Product mix will pressure gross margins in the back half as memory and AI hardware volumes rise, partially offset over time by higher-margin solution sales.
  • Strategic R&D Investment: Ongoing investment in clusterware software, CXL, and photonic memory will be critical to sustaining differentiation as the AI infrastructure market matures.

Risks

Penguin faces several material risks: supply chain volatility could limit memory sales and delay deployments, while rising memory costs and a shift toward lower-margin hardware threaten profitability in the near term. The transition away from hyperscale and Penguin Edge introduces revenue variability, and competitive dynamics—particularly as partners like Nvidia expand reference architectures—could intensify. Execution on new product launches and customer diversification will be pivotal to offsetting these headwinds.

Forward Outlook

For Q3 2026, Penguin guided to:

  • Continued strong memory segment growth, subject to supply availability
  • Lower advanced computing revenue as legacy business winds down

For full-year 2026, management raised guidance:

  • Net sales growth of 12% (prior 6%)
  • Non-GAAP diluted EPS of $2.15 (prior $2.00)
  • Memory segment growth of 65% to 75%
  • Advanced computing down 15% to 25% due to hyperscale and Penguin Edge exit
  • Gross margin outlook reduced by 1 point to 28%, reflecting mix shift
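As a sanity check on how the segment ranges roll up, total growth is roughly the prior-year-revenue-weighted average of segment growth rates. A minimal sketch (illustrative Python; the prior-year segment weights below are assumptions for illustration, not disclosed figures):

```python
# Blended full-year growth = sum of (prior-year revenue weight x segment growth).
# Weights are assumed for illustration; growth rates are guidance midpoints.
weights = {"memory": 0.32, "advanced_computing": 0.48, "led": 0.20}
growth  = {"memory": 0.70,               # midpoint of +65% to +75% guide
           "advanced_computing": -0.20,  # midpoint of -25% to -15% guide
           "led": 0.00}                  # assumed roughly flat

total_growth = sum(weights[s] * growth[s] for s in weights)
print(f"blended growth: {total_growth:+.1%}")  # → blended growth: +12.8%
```

Under these assumed weights the blend lands near the ~12% total-company guide, showing how a 65%+ memory guide coexists with a double-digit decline in advanced computing.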

Management emphasized:

  • Visibility into memory demand is strong, but upside is gated by material availability
  • AI HPC bookings and pipeline diversification support confidence in long-term growth despite near-term revenue volatility

Takeaways

Penguin Solutions is rapidly transforming into an AI infrastructure and memory leader, with execution in memory and platform innovation driving renewed growth visibility.

  • AI Memory Demand Is Structural: The secular shift to inference workloads is creating a durable, less-cyclical foundation for Penguin’s memory business, with CXL and photonic memory set to enhance long-term margin potential.
  • Non-Hyperscale Diversification Is Accelerating: New enterprise, neocloud, and sovereign wins are reducing hyperscale risk and building a more resilient revenue base, though near-term variability remains as legacy contracts wind down.
  • Execution on Platform and Supply Chain Will Define Trajectory: Success hinges on delivering differentiated AI factory solutions, managing supply constraints, and sustaining customer engagement as the competitive landscape evolves.

Conclusion

Penguin’s Q2 marks a strategic turning point, with memory-driven AI infrastructure now the company’s core value lever. The business is executing well on customer diversification and product innovation, but near-term margin and revenue volatility will persist as legacy segments wind down. Sustained focus on AI platform differentiation and supply chain agility will be critical to maintaining momentum.

Industry Read-Through

Penguin’s results reinforce that AI infrastructure demand is rapidly evolving from training-centric to inference-centric workloads, with memory architecture and integration emerging as key differentiators. The surge in CXL adoption and early investments in photonic memory signal a broader industry pivot toward heterogeneous, memory-rich systems. Suppliers and integrators with deep expertise in both compute and memory stand to benefit disproportionately as AI factories proliferate across enterprise and sovereign domains. However, the transition away from hyperscale dependency and the need for supply chain agility are sector-wide challenges. Competitors and partners alike must adapt to a landscape where customer requirements are shifting toward full-stack, production-grade AI deployments.