Taken from the ISC HPC Blog, by Peter Ffoulkes
This month, the annual HPC Advisory Council meeting took place in Switzerland, stirring discussion about the future of big data and the available and emerging technologies aimed at handling the flood of data inundating enterprises.
“Going back to fundamentals, HPC is frequently defined as either compute intensive or data intensive computing or both. Welcome to today’s hottest commercial computing workload, “Total Data” and business analytics. As described by 451 Research, “Total Data” involves processing any data that might be applicable to the query at hand, whether that data is structured or unstructured, and whether it resides in the data warehouse, or a distributed Hadoop file system, or archived systems, or any operational data source – SQL or NoSQL – and whether it is on-premises or in the cloud.”
According to Ffoulkes, the answer to the total data question lies in HPC. This week, many of us attended the OpenFabrics Alliance User and Developer Workshop, where we discussed these same topics: enterprise data processing needs, cloud computing, and big data. The event has ended, but I hope the discussions continue as we look to the future of big data.
In the meantime, be sure to check out Peter's thoughts in his full blog post.