'Big Data' Isn't Just Quantity, It's Quality: Gartner
The IT term "big data" is now getting close to the same hot-button treatment cloud computing received five years ago, and for good reason.
The ever-increasing amount of business data created by both humans and machines is having a major effect on IT systems, which are struggling to store, tier and make all that information easily accessible.
But it's not just about high volumes of data. It's also about the growing variety of data types coming into systems, which need to be handled differently from simple email, data logs and credit card records. And it's about the velocity at which all this data moves from endpoints into storage.
Along these lines, IT research firm Gartner released a report June 27 claiming that many IT decision makers are attempting to manage big-data problems by focusing exclusively on high volumes of information, to the exclusion of the other facets of information management, which could lead to difficulties later.
"Today's information management disciplines and technologies are simply not up to the task of handling all these dynamics," said Gartner Research Vice President Mark Beyer. "Information managers must fundamentally rethink their approach to data by planning for all the dimensions of information management.
"The business' demand for access to the vast resources of big data gives information managers an opportunity to alter the way the enterprise uses information. IT leaders must educate their business counterparts on the challenges while ensuring some degree of control and coordination so that the big-data opportunity doesn't become big-data chaos, which may raise compliance risks, increase costs and create yet more silos," he said.