Why Data Pros Have a Need for Speed

By Dennis McCafferty  |  Posted 06-29-2016

As a means of achieving what's called "fast data/big data," a majority of organizations are deploying tools that enable near real-time data pipelines, according to a recent survey from OpsClarity. The resulting report, titled "2016 State of Fast Data and Streaming Applications," defines near real-time as processing data within five minutes of its ingestion. Survey respondents say these pipelines leave them better positioned to power customer-facing apps while delivering analytics that optimize internal business processes.

In seeking better and faster ways to manage data, companies plan to boost their data streaming capabilities within the next year, while working to overcome obstacles such as frequent code changes and an inability to respond effectively to data surges.

"The digital world is moving faster than ever—with massive volumes of data, more devices, users and applications," according to the report. "To keep up with the pace, large volumes of dynamically changing data must be processed at high velocity. Companies cannot afford to wait for analytics that only kick in at midnight to make critical decisions that impact their business. Real-time insights are imperative. Streaming or fast data applications are driving a new wave of data revolutions."

More than 4,000 developers, data architects, DevOps team members, senior IT managers and other IT pros took part in the research.
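To make the report's five-minute criterion concrete, here is a minimal sketch (in Python, with hypothetical function and variable names not taken from the report) of how a pipeline might flag records that miss the near-real-time window:

```python
from datetime import datetime, timedelta

# Threshold cited in the OpsClarity report: data should be processed
# within five minutes of being ingested to count as near real-time.
NEAR_REAL_TIME_LIMIT = timedelta(minutes=5)

def is_near_real_time(ingested_at: datetime, processed_at: datetime) -> bool:
    """Return True if the record was processed within the five-minute window."""
    return processed_at - ingested_at <= NEAR_REAL_TIME_LIMIT

# Example: a record ingested at noon and processed three minutes later
# meets the bar; one processed seven minutes later does not.
ingested = datetime(2016, 6, 29, 12, 0, 0)
print(is_near_real_time(ingested, ingested + timedelta(minutes=3)))  # True
print(is_near_real_time(ingested, ingested + timedelta(minutes=7)))  # False
```

In practice, a streaming system would compute this lag per record (or per micro-batch) and alert when the window is exceeded, for instance during the data surges the survey respondents cite as a key obstacle.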

Dennis McCafferty is a freelance writer for Baseline Magazine.
