Key Moments
- Mizuho called Samsung its most attractive long idea for next year, suggesting the stock “could be a total home run.”
- The firm highlighted a potential shift in AI hardware toward LPUs and broader use of GDDR, LPDDR, and SRAM, which could expand the AI memory market.
- Mizuho also cited Samsung’s leadership in next-generation GDDR7 and its exposure to enterprise QLC SSDs as key advantages heading into 2026.
Mizuho Flags Samsung as Top Memory Play Into 2026
In a note published Monday, Mizuho identified Samsung as its preferred memory stock for 2026. The firm argued that Samsung is particularly well positioned as AI hardware requirements evolve.
Analyst Jordan Klein described Samsung as “the most compelling long idea” and added that the stock “could be a total home run.” The call rests on a broader thesis: changing AI workloads will reshape demand across memory product categories.
Groq LPU Report Highlights Shift in AI Inference Hardware
Klein also referenced a report from Digitimes detailing the performance characteristics of Groq’s language processing unit (LPU) and its potential impact on high-bandwidth memory suppliers.
According to the article, “LPUs significantly reduce data processing latency, enabling inference speeds faster than human conversational pace.” The report also stated that Groq claims its LPU can execute large-language-model tasks “10 times faster than current solutions while consuming only one-tenth the energy.”
Broader AI Memory Mix: From HBM to GDDR, LPDDR, and SRAM
Mizuho argued that advances like Groq’s LPU may change how memory is deployed across AI systems. While the firm expects training workloads to remain heavily dependent on high-bandwidth memory, it sees a more diversified picture for inference and on-device AI.
In the firm’s view, training will continue to lean on HBM, while inference devices and on-device AI gadgets will increasingly use GDDR, LPDDR, and SRAM. That shift could enlarge the overall addressable market for AI-related memory.
LPU Architecture Renews Focus on High-Speed Memory Suppliers
Klein emphasized that Groq’s LPU architecture “enhances computing speed by leveraging high-speed memories like SRAM.” That design focus has drawn attention to component vendors such as Cypress and Renesas, which could benefit from increased demand for high-speed memory solutions.
| Memory Type | Primary Use Case (per Mizuho note) | Key Comment |
|---|---|---|
| HBM | AI training workloads | Expected to remain core for training |
| GDDR | Inference devices and AI gadgets | Nvidia Rubin CPX reportedly shifting to GDDR |
| LPDDR | On-device AI and inference | Gaining importance in AI gadgets |
| SRAM | High-speed compute architectures | Used in LPU design to boost speed |
Nvidia Rubin CPX and Samsung’s GDDR7 Leadership
The Mizuho note also drew attention to Nvidia’s Rubin CPX chip roadmap. According to the firm, the product will “forego costly HBM in favor of more affordable and readily available GDDR,” a change that could alter demand dynamics across memory suppliers.
In addition, Mizuho said Samsung holds a strong position in GDDR7. If Rubin CPX and similar designs favor GDDR over HBM, Samsung’s role as a leading supplier could become increasingly important.
Generative AI Storage Needs Support Bullish Memory Outlook
Mizuho also linked its constructive stance on Samsung to storage trends tied to generative AI. As AI applications expand into video and high-resolution imagery, Klein highlighted rising demand for enterprise-grade QLC solid-state drives.
The firm therefore believes Samsung is positioned to benefit across several segments of the memory and storage market. Mizuho expects memory pricing to strengthen into 2026 and thinks Samsung’s breadth across DRAM, GDDR, and QLC SSDs could translate into meaningful upside under its AI-driven scenario.