In the ongoing quest for faster access to data, Diablo Technologies has taken what could be a significant next step.
Diablo’s Memory Channel Storage (MCS) architecture, expected to show up in servers shipping later this year, allows flash storage components to plug into the super-fast channel now used to connect CPUs with memory. That will slash data-access delays even more than current flash caching products that use the PCI Express bus, according to Kevin Wagner, Diablo’s vice president of marketing.
The speed gains could be dramatic, according to Diablo, helping to give applications such as databases, big data analytics and virtual desktops much faster access to the data they need most. Diablo estimates that MCS can reduce latencies by more than 85 percent compared with PCI Express SSDs (solid-state drives). Alternatively, the flash components could be used as memory, making it affordable to equip servers with terabytes of memory, Wagner said.
Other than on-chip cache, the memory channel is the fastest route to a CPU, Wagner said. Not only do bits fly faster over this link, there are also no bottlenecks under heavy use. The connection is designed to be used by many DIMMs (dual in-line memory modules) in parallel, so each component doesn’t have to relinquish the bus for another one to use it. That saves time, as well as CPU cycles that would otherwise be used managing the bus, Wagner said.
The parallel design of the memory bus also lets system makers scale up the amount of flash in a server without worrying about diminishing returns, he said. A second MCS flash card will truly double performance, where an added PCIe SSD could not, Wagner said.
Diablo, which has been selling memory controllers for about 10 years, has figured out a way to use the standard DDR3 interface and protocols to connect flash instead of RAM to a server’s CPU. Flash is far less expensive than RAM and much more compact. The MCS components, which come in 200GB and 400GB sizes, will fit into standard DIMM slots that typically accommodate just 32GB or so of memory. The only adaptation manufacturers will need to make is adding a few lines of code to the BIOS, Wagner said.
Enterprises are more likely to use MCS as high-capacity memory than as low-latency storage, said analyst Jim Handy of Objective Analysis.
“Having more RAM is something that a lot of people are going to get very excited about,” Handy said. His user surveys show most IT departments automatically get as much RAM as they can for their servers, because memory is where they can get the fastest access to data, Handy said.
“Basically, you’d like everything to be in the RAM,” Handy said. Virtualized data centers, where many servers need to share a large set of data, need a shared store of that data. But in other applications, especially databases and online transaction processing, storage is just a cheaper and more plentiful — but slower — alternative to memory. “Everything that’s on the storage is there just because it can’t fit on the RAM,” he said.