Not enough is said about front-end or back-end InfiniBand storage, but fear not! An interesting article from Dave Raffo at SearchStorage.com just came out that I think is worth spreading through the InfiniBand community. I have a quick summary below, but be sure to also check out the full article: “Health care system rolls its own data storage ‘cloud’ for researchers.”

Partners HealthCare, a non-profit organization founded in 1994 by Brigham and Women’s Hospital and Massachusetts General Hospital, is an integrated health care system that offers patients a continuum of coordinated high-quality care.
Over the past few years, advances in the resolution and accuracy of medical devices and instrumentation have led to an explosion of data in biomedical research. Partners HealthCare recognized early on that a cloud-based research compute and storage infrastructure could be a compelling alternative for its researchers. Not only would it let the organization distribute costs and provide storage services on demand, it would also save the IT management time spent fixing the independent research computers scattered across the Partners HealthCare network.

To address these needs, Partners HealthCare developed its own specially designed storage network. Initially, the team chose Ethernet as the transport technology for the storage system. As demand grew, the solution began hitting significant performance bottlenecks, particularly when reading and writing hundreds of thousands of small files. The issue was traced to the interconnect: Ethernet's comparatively high latency was the problem. To provide a scalable, low-latency solution, Partners HealthCare turned to InfiniBand. With InfiniBand on the storage back end, Partners HealthCare saw read times that were roughly two orders of magnitude faster.
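To see why small files hit the interconnect so hard, here is a minimal back-of-envelope sketch: each file costs at least one metadata round trip, so total time scales with per-operation latency rather than bandwidth. The latency figures and file count below are illustrative assumptions, not measurements from the Partners HealthCare deployment.

# A minimal sketch (hypothetical numbers, not Partners HealthCare's measurements) of why
# per-file round trips make interconnect latency, not bandwidth, the bottleneck when a
# network file system touches hundreds of thousands of small files.

def metadata_time_minutes(num_files, round_trip_latency_s, ops_per_file=2):
    """Approximate time spent on per-file metadata round trips, in minutes."""
    return num_files * ops_per_file * round_trip_latency_s / 60

num_files = 500_000        # "hundreds of thousands of small files"
ethernet_rtt = 200e-6      # assumed ~200 microseconds per TCP/IP-over-Ethernet round trip
infiniband_rtt = 5e-6      # assumed ~5 microseconds per RDMA round trip over InfiniBand

print(f"Ethernet:   ~{metadata_time_minutes(num_files, ethernet_rtt):.1f} minutes")
print(f"InfiniBand: ~{metadata_time_minutes(num_files, infiniband_rtt):.2f} minutes")

Even with these rough numbers, the per-file latency term dominates, which is consistent with the directory-listing experience Richter describes below.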

“One user had over 1,000 files, but only took up 100 gigs or so,” said Brent Richter, corporate manager for enterprise research infrastructure and services, Partners HealthCare System. “Doing that with Ethernet would take about 40 minutes just to list that directory. With InfiniBand, we reduced that to about a minute.”

Partners HealthCare also chose InfiniBand over 10 Gigabit Ethernet because of InfiniBand's lower latency. “InfiniBand was price competitive and has lower latency than 10-Gig Ethernet,” said Richter, who noted that the final price tag came to about $1 per gigabyte.

By integrating InfiniBand into the storage solution, Partners HealthCare was able to cut latency dramatically and increase performance, giving its research customers faster response times and higher capacity.

Great to see end-user cases like this come out! If you are a member of the IBTA and would like to share one of your InfiniBand end user deployments, please contact us at press@infinibandta.org.

Till next time,
Brian Sparks
IBTA Marketing Working Group Co-Chair