It’s increasingly evident that data is the capital of the digital age. It unlocks answers to questions, provides insights into processes, and helps organizations connect to customers, business partners and others in new and sometimes profound ways.
Although most enterprise leaders recognize how critical it is to have a big data strategy in place, it isn’t always obvious how quickly the landscape is advancing and evolving. “It is a very different environment than it was only a few years ago,” says Scott Schlesinger, chief analytics officer at IT consulting firm Cognizant Digital Business.
For one thing, the three “Vs” of big data—volume, variety and velocity—have expanded to include veracity and value. Yet, regardless of how one chooses to define the concept, a basic fact is clear: Big data is getting even bigger due to the internet of things (IoT), the maturation of cloud computing, advancements in IT frameworks and the introduction of big data as a service.
“Big players such as Amazon, Google and Microsoft are scaling up big data and pushing the boundaries,” says Gourtham Belliappa, vice president for data and analytics at consulting firm Capgemini.
How can CIOs adopt a successful big data framework and maximize results? How is it possible to transform billions of data points into results? And how can organizations move beyond the basics and adopt a best practice approach?
“The goal is to move beyond a tactical approach and build a strategic framework,” advises Cognizant’s Schlesinger. “Big data initiatives must solve real-world business problems. This requires a fundamental rethinking of who owns data within an organization and how to leverage it for maximum value and returns.”
Driving Massive Changes in the Big Data Space
A couple of key factors are driving massive changes in the big data space. First, the IoT introduces a vast array of new data points—and opportunities. Sensors, machine data, smartphones, point-of-sale (POS) information, social streams, and other forms of structured and unstructured data offer insight into events that wouldn’t have been possible to track or understand only a few years ago.
Second, the maturation of cloud computing has introduced instant-on, highly automated, near real-time data frameworks. As Capgemini’s Belliappa puts it: “Big data as a service introduces an environment where configuration, management and maintenance issues largely disappear.”
Today, the software required for advanced data analytics is often inexpensive or even free, he says. “These containers are available from providers at a very low cost,” Belliappa adds, “and the price is continuing to drop. Not only do you eliminate software licensing, you’re [also] able to drive down the total cost of ownership from both a hardware and software perspective.
“Ultimately, you can put the money into development instead of operations. You can move toward a faster and more agile framework where it’s possible to put datasets to maximum use and move in new and different directions quickly.”
One advantage of adopting a cloud framework for big data is the ability to insulate data manipulation logic from the underlying infrastructure. Extract, Transform, Load (ETL) abstraction tools connect data sources without regard to the service provider. This allows an organization to swap service providers in and out while still applying the same underlying business logic.
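The decoupling described above can be sketched in a few lines of Python. This is a minimal illustration, not a real ETL tool: the function and connector names (`run_etl`, `extract_from_provider_a`, `load_to_provider_b`) are hypothetical, standing in for whatever provider SDKs an organization actually uses. The point is structural—the business logic is written once against plain records, and only the connectors change when providers change.

```python
from typing import Callable, Dict, Iterable, List

Record = Dict[str, object]

def run_etl(extract: Callable[[], Iterable[Record]],
            transform: Callable[[Record], Record],
            load: Callable[[List[Record]], None]) -> None:
    """Wire any extract/load pair to the same transform logic."""
    load([transform(rec) for rec in extract()])

# Hypothetical connectors for two providers; real connectors would call
# provider-specific APIs behind these same signatures.
def extract_from_provider_a() -> Iterable[Record]:
    return [{"sku": "A1", "units": 3, "unit_price": 9.99}]

loaded: List[Record] = []
def load_to_provider_b(records: List[Record]) -> None:
    loaded.extend(records)

# The business logic never references a provider, so swapping
# extract/load connectors leaves it untouched.
def add_revenue(rec: Record) -> Record:
    return {**rec, "revenue": rec["units"] * rec["unit_price"]}

run_etl(extract_from_provider_a, add_revenue, load_to_provider_b)
```

Because `run_etl` accepts the connectors as parameters, moving from one cloud provider to another means supplying a different extract or load function—the `add_revenue` transform, and any logic like it, is reused as-is.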