TMCnet News

Mellanox Optimizes Flash Storage Access, Enabling Faster Time-to-Decision While Lowering Cost
[August 11, 2015]

Flash Memory Summit - Mellanox® Technologies, Ltd. (NASDAQ:MLNX), a leading supplier of high-end cloud and storage networking solutions, today announced a set of storage demonstrations and performance benchmark results that showcase how customers can accelerate applications, analyze data faster, and lower expenditures by using Remote Direct Memory Access (RDMA) technology to share solid-state storage.

"The higher performance of solid state storage demands higher performance from the interconnect technology in order for businesses to realize the true power of their data centers and the information they hold," stated Kevin Deierling, vice president of marketing at Mellanox Technologies. "Customers have been clear that when they invest in flash storage, they want to share it across multiple servers using the most efficient networking available to maximize their return on investment. Nearly everyone who wants to demonstrate fast fabric access to flash is using Mellanox products, and we are excited to show these solutions with our partners."

During the Flash Memory Summit (August 11-13, 2015), Mellanox and key partners will demonstrate the following solutions that illustrate how high-speed networking empowers solid state storage technologies:

  • Apeiron will be demonstrating their Apeiron Data Fabric™ (booth #819);
  • Phase-change memory (PCM) access over EDR 100Gb/s InfiniBand with access latency of less than 2 microseconds with HGST (booth #647);
  • NMX-series Server SAN and NVMe over Fabrics powered by Mellanox to achieve millions of IOPS with Mangstor (booth #649);
  • NVMe Over Fabrics network access to NVMe devices with industry leading high performance and low latency, using both RDMA Over Converged Ethernet (RoCE) and InfiniBand demonstrated with Memblaze (booth #319), Micron, and PMC-Sierra (booth #213);
  • iSCSI over RDMA (iSER) access to a flash array with extremely high throughput and IOPS, demonstrated with Micron, Saratoga Speed (booth #517), and others;
  • NetApp EF560 flash array with iSER capability using Mellanox FDR 56Gb/s InfiniBand technology (booth #511);
  • Next generation NVRAM replication over an RDMA network exhibiting very low latency with PMC-Sierra (booth #213);
  • Next generation NVMe flash technology with key applications (Samsung Semiconductor booth #307);
  • InfiniFlash array with unprecedented high IOPS Ceph performance using Mellanox 40GbE networking with SanDisk (booth #207).

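To make the "NVMe Over Fabrics" bullets above more concrete, the sketch below shows how a Linux host can discover and attach a remote NVMe flash target across an RDMA-capable network using the open-source nvme-cli tool. The target address, port, and subsystem NQN are placeholder values, and nvme-cli's fabrics support postdates these pre-standard 2015 demonstrations; this is an illustrative sketch of the general technique, not the configuration used in any of the booths listed.

```shell
# Load the NVMe-over-Fabrics RDMA initiator module (assumes a RoCE- or
# InfiniBand-capable adapter with RDMA support already configured).
modprobe nvme-rdma

# Discover NVMe subsystems exported by a target at a placeholder address,
# using the conventional NVMe-oF port 4420.
nvme discover -t rdma -a 192.168.1.10 -s 4420

# Connect to a discovered subsystem by its (placeholder) NQN; the remote
# flash namespaces then appear locally as /dev/nvmeXnY block devices.
nvme connect -t rdma -n nqn.2015-08.com.example:flash-array -a 192.168.1.10 -s 4420

# Verify that the fabric-attached namespaces are visible.
nvme list
```

An analogous iSER setup would attach the initiator with iscsiadm using the iser transport instead of TCP; in both cases the RDMA fabric lets the host read remote flash with latency close to local access.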


Mellanox will showcase new technologies that expedite access to solid state storage, including the new Spectrum 10/25/40/50/100Gb/s Ethernet switch, ConnectX-4 100Gb/s Ethernet and InfiniBand adapter, and ConnectX-4 Lx 10/25/40/50Gb/s Ethernet adapter. These solutions can be seen in the Mellanox booth #817. More details about the demonstrations and technologies can be found in the Mellanox blogs at http://www.mellanox.com/blog/.

About Mellanox

Mellanox Technologies is a leading supplier of end-to-end InfiniBand and Ethernet interconnect solutions and services for servers and storage. Mellanox interconnect solutions increase data center efficiency by providing the highest throughput and lowest latency, delivering data faster to applications and unlocking system performance capability. Mellanox offers a choice of fast interconnect products: adapters, switches, software, cables and silicon that accelerate application runtime and maximize business results for a wide range of markets including high-performance computing, enterprise data centers, Web 2.0, cloud, storage and financial services. More information is available at www.mellanox.com.

Mellanox, ConnectX and SwitchX are registered trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.

