AI deployments use multiple storage layers, and each one has different requirements, says Dell’Oro’s Fung. For storing massive amounts of raw, unstructured data, cold storage on HDDs makes more sense, he says. SSDs are better suited to warm storage, such as pre-processing data, post-training, and inference. “There’s a place for each type of storage,” he says.
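To make that split concrete, here is a minimal sketch of how a tiering policy might route datasets along the lines Fung describes. The function name, tier labels, and the one-read-per-day threshold are hypothetical illustrations, not details from the article or any vendor.

```python
from enum import Enum

class Tier(Enum):
    HDD_COLD = "hdd_cold"  # bulk raw/unstructured data, lowest cost per TB
    SSD_WARM = "ssd_warm"  # pre-processing, post-training, and inference reads

def pick_tier(reads_per_day: float, raw_archive: bool) -> Tier:
    """Toy tiering rule following the cold/warm split described above.

    The one-read-per-day cutoff is an arbitrary illustrative threshold.
    """
    if raw_archive or reads_per_day < 1.0:
        return Tier.HDD_COLD
    return Tier.SSD_WARM

print(pick_tier(reads_per_day=0.1, raw_archive=True))    # Tier.HDD_COLD
print(pick_tier(reads_per_day=50.0, raw_archive=False))  # Tier.SSD_WARM
```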
Planning ahead
According to Constellation’s Mehta, data center managers and other storage buyers should prepare by treating SSD procurement the way they treat GPU procurement: “Multi-source, lock in lanes early, and engineer to standards so vendor swaps don’t break your data path.” He recommends qualifying at least two vendors for both QLC and TLC drives and starting early.
TrendForce’s Ao agrees. “It is better to build inventory now,” he says. “It is difficult to lock in long-term deals with suppliers now due to tight supply in 2026.”
Based on current supply availability, Kioxia, SanDisk, and Micron are best positioned to support 128-terabyte QLC enterprise SSD solutions, Ao says. “But in the longer term, some module houses may be able to provide similar solutions at a lower cost,” he adds. “We are seeing more module houses, such as Phison and Pure Storage, supporting these solutions.”
And it’s not just SSDs for fast storage and HDDs for slow storage: memory solutions are becoming more complex in the AI era, says Ao. “For enterprise players with smaller-scale business models, it is important to keep an eye on Z-NAND and XL-Flash for AI inference demand,” he says.
These are low-latency flash technologies that sit between conventional SSDs and DRAM working memory. “These solutions will be more cost-effective compared to HBM or even HBF [high-bandwidth flash],” he says.
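As a rough map of where each technology discussed here falls, the sketch below orders the tiers from fastest and most expensive per bit to slowest and cheapest. The relative positions reflect the conventional memory-storage hierarchy; no specific latency or pricing figures are implied.

```python
# Illustrative ordering only: positions follow the conventional
# memory-storage hierarchy; exact latency and cost vary by product.
MEMORY_STORAGE_HIERARCHY = [
    "HBM (high-bandwidth memory stacked next to the GPU)",
    "DRAM (system working memory)",
    "Z-NAND / XL-Flash (low-latency flash between DRAM and SSDs)",
    "TLC/QLC NVMe SSDs (warm storage: pre-processing, inference)",
    "HDDs (cold storage for bulk raw data)",
]

for depth, tier in enumerate(MEMORY_STORAGE_HIERARCHY):
    print(f"Tier {depth}: {tier}")
```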