Databases Are the Weak Point in Big Data Projects

By Karen A. Frenkel | Posted 06-02-2015

Big data is at the core of today's businesses and is recognized as the key to helping companies unlock never-before-seen opportunities and competitive advantages. But going from data points to competitive advantage can be a challenge filled with many variables. Chad Jones, Chief Strategy Officer at Deep Information Sciences, wants to know: Where are all of these tremendous insights we've been hearing so much about? "The reality is that today's databases, built on math that is more than 30 years old, don't have the scale, speed and performance capabilities to handle big data," he said. "It's time the industry accelerates legacy practices with an adaptive alternative capable of machine learning. With this new underlying structure, databases' most critical functions, from high-speed data ingest to orderly capacity planning, will finally perform as they should, so businesses can identify the opportunities and business intelligence that big data is capable of delivering." Here are his tips for fixing what's wrong with databases.

Math Is Out of Date

"Modern" database algorithms are still based on 1970s technology, creating a need for updated math to deal with the scale and performance demands of big data.

MySQL Architecture Is Old

MySQL was built in 1995, when the fastest Intel processor was the Pentium Pro. It needs an updated architecture to take advantage of modern hardware.

Images Depend on Configuration

Database deployment images are configuration-dependent, causing an explosion of packages. Businesses need databases that observe and adapt to any physical, virtual or cloud instance without a separate installation package for each platform.
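
To make "observe and adapt" concrete, here is a minimal Python sketch of a database process that inspects its host at startup and derives its own settings instead of shipping a package per platform. The function names, the 60 percent buffer-pool heuristic and the POSIX-only memory probe are illustrative assumptions, not any vendor's implementation.

```python
import os

def detect_host():
    """Observe the hardware this instance landed on: physical, virtual or cloud."""
    cores = os.cpu_count() or 1
    # Total RAM via POSIX sysconf (Linux/macOS); assumed fallback elsewhere.
    try:
        ram = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    except (ValueError, OSError, AttributeError):
        ram = 1 << 30  # assume 1 GiB when the host can't be probed
    return {"cores": cores, "ram_bytes": ram}

def derive_config(host):
    """Derive tuning from the observed host instead of a per-platform package."""
    return {
        "buffer_pool_bytes": int(host["ram_bytes"] * 0.6),  # 60% of RAM: an assumed heuristic
        "worker_threads": max(2, host["cores"] - 1),        # leave one core for the OS
    }

print(derive_config(detect_host()))
```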

Inflexible Systems

DBAs are stuck on a time-consuming tuning treadmill because systems are inflexible, creating a need for databases that self-optimize online as data workloads ebb and flow.
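
A toy sketch of what online self-optimization could look like: a feedback loop that watches a performance signal (here, cache hit rate) and adjusts a tuning knob with no restart and no DBA. The class, thresholds and growth factors are all invented for illustration.

```python
class SelfTuningCache:
    """A knob that re-tunes itself online as the workload ebbs and flows."""

    def __init__(self, size=1_000):
        self.size = size
        self.hits = 0
        self.lookups = 0

    def record(self, hit: bool):
        self.lookups += 1
        self.hits += int(hit)

    def retune(self):
        """Called periodically; grows or shrinks the cache while running."""
        if self.lookups == 0:
            return
        hit_rate = self.hits / self.lookups
        if hit_rate < 0.80:                # thrashing: grow the cache
            self.size = int(self.size * 1.25)
        elif hit_rate > 0.98:              # over-provisioned: give memory back
            self.size = max(100, int(self.size * 0.9))
        self.hits = self.lookups = 0       # start a fresh observation window
```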

Databases Not Concurrent

Databases typically handle only a single type of workload. Instead, the database should concurrently host multiple workloads (ingest, transactions, analytics) without destroying performance.
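
One simple way to host mixed workloads side by side is to give each workload class its own thread pool, so a long analytic scan cannot starve ingest or transactions. The sketch below assumes this pool-per-workload design with invented pool sizes; real engines use far more sophisticated schedulers.

```python
from concurrent.futures import ThreadPoolExecutor

# One pool per workload class; sizes are invented for illustration.
POOLS = {
    "ingest": ThreadPoolExecutor(max_workers=4),
    "transactions": ThreadPoolExecutor(max_workers=8),
    "analytics": ThreadPoolExecutor(max_workers=2),  # capped so scans can't starve the rest
}

def submit(workload, fn, *args):
    """Route each request to its own pool so workloads can't crowd each other out."""
    return POOLS[workload].submit(fn, *args)

# Usage: an analytic scan and an ingest operation proceed concurrently.
scan = submit("analytics", sum, range(10_000_000))
insert = submit("ingest", print, "row appended")
print(scan.result())
```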

Multiple Algorithms Needed

A fixed algorithm choice allows only a single set of behaviors regardless of the workload. Today's databases must offer multiple algorithms that can be switched on the fly based on workload requirements.
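
Switching algorithms on the fly is essentially a strategy pattern. The sketch below swaps between a write-optimized and a read-optimized insert path as the observed read/write mix shifts; both strategies are deliberately naive stand-ins for real storage algorithms.

```python
class AppendOptimized:
    """Cheap unordered writes: good while the workload is insert-heavy."""
    def insert(self, store, key, value):
        store.append((key, value))

class ReadOptimized:
    """Keeps the store sorted so lookups are fast: good when reads dominate."""
    def insert(self, store, key, value):
        store.append((key, value))
        store.sort()

class Engine:
    def __init__(self):
        self.store = []
        self.strategy = AppendOptimized()

    def observe(self, reads, writes):
        """Swap the algorithm on the fly as the observed workload mix shifts."""
        self.strategy = ReadOptimized() if reads > writes else AppendOptimized()

    def insert(self, key, value):
        self.strategy.insert(self.store, key, value)
```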

Compression Needed

Compression is needed to maximize storage capacity and minimize costs, but in current architectures compression competes with query processing for CPU and I/O, sacrificing performance and scale. Architectures are needed that offload compression to separate CPU threads so the impact on performance is minimal.
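
As a rough illustration of offloading, the sketch below moves compression onto its own thread: the write path enqueues raw pages and returns immediately, while a background worker compresses them off the query path. zlib and the queue are from Python's standard library; the page contents and the sentinel-based shutdown are assumptions.

```python
import queue
import threading
import zlib

raw_pages = queue.Queue()
compressed = []

def compressor():
    """Runs on its own thread, keeping compression cost off the query path."""
    while True:
        page = raw_pages.get()
        if page is None:          # sentinel: shut down
            return
        compressed.append(zlib.compress(page, 6))

worker = threading.Thread(target=compressor, daemon=True)
worker.start()

# The write path only enqueues and returns immediately.
raw_pages.put(b"row data " * 4096)
raw_pages.put(None)
worker.join()
print(len(compressed[0]), "compressed bytes")
```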

No Predictable Performance at Scale

Due to algorithm limitations, databases eventually run off the "performance cliff" despite the best possible configurations. It's time databases delivered predictable performance at scale and allowed for orderly capacity planning.

Inadequate Scaling

Database scaling hits a brick wall. Scaling to billions of records should be normal, with state-of-the-art algorithms that reduce IOPS through intelligent caching, thereby eliminating unneeded reads and writes.
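
A minimal sketch of how caching trims IOPS: an LRU read cache eliminates repeat reads, and write coalescing collapses back-to-back updates to the same key into a single disk write. The capacity, eviction policy and read/write callbacks are illustrative.

```python
from collections import OrderedDict

class IOPSReducingCache:
    def __init__(self, capacity=10_000):
        self.capacity = capacity
        self.cache = OrderedDict()   # LRU read cache
        self.dirty = {}              # coalesced pending writes

    def read(self, key, read_from_disk):
        if key in self.cache:
            self.cache.move_to_end(key)      # hit: no disk I/O at all
            return self.cache[key]
        value = read_from_disk(key)          # miss: the only read IOP
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used
        return value

    def write(self, key, value):
        self.cache[key] = value
        self.dirty[key] = value              # later writes replace earlier ones

    def flush(self, write_to_disk):
        """One disk write per key, no matter how many updates it received."""
        for key, value in self.dirty.items():
            write_to_disk(key, value)
        self.dirty.clear()
```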

Unpredictable Data Performance

Multiple issues coalescing in a single infrastructure cause unpredictable data performance. With a more streamlined infrastructure that eliminates separate production, backup and geo-location silos, companies can avoid platform fragmentation.

ETL Process Required

The ETL process slows business analytics and reduces the depth of insights because of the trade-off between data size and processing time. Next-generation databases need to run analytics against the same production platform and data set, in place, without ETLs.
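
The alternative to ETL is to run the analytic query against the production data set in place. In the sketch below, SQLite stands in for the production database and the orders schema is invented; the point is that the aggregate reads the live table directly, with no extract, transform or load step.

```python
import sqlite3

# An in-memory SQLite database stands in for the production system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "east", 120.0), (2, "west", 75.5), (3, "east", 210.0)],
)

# No export, no transform, no load: the analytic aggregate
# runs against the same table the transactional workload uses.
for region, revenue in conn.execute(
    "SELECT region, SUM(total) FROM orders GROUP BY region"
):
    print(region, revenue)
```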

Antiquated Data Processing Methods

Databases use antiquated techniques to process data, creating fragile environments. Today's businesses need databases that separate memory and disk structures from each other to ensure database integrity.
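
One reading of "separate memory and disk structures" is a log-plus-memtable design: updates land in a fast in-memory table and are made durable through an append-only log, so a crash mid-update cannot corrupt the on-disk structure. The file format and recovery routine below are illustrative, not any particular product's.

```python
import json

class DecoupledStore:
    """In-memory table for speed; append-only log on disk for integrity."""

    def __init__(self, log_path="updates.log"):
        self.memtable = {}                 # the memory structure
        self.log = open(log_path, "a")     # the separate disk structure

    def put(self, key, value):
        # Durability first: append the change to the log, then update memory.
        self.log.write(json.dumps({"k": key, "v": value}) + "\n")
        self.log.flush()                   # a real system would also fsync
        self.memtable[key] = value

    @classmethod
    def recover(cls, log_path="updates.log"):
        """Rebuild memory state by replaying the intact on-disk log."""
        store = cls(log_path)
        with open(log_path) as log:
            for line in log:
                rec = json.loads(line)
                store.memtable[rec["k"]] = rec["v"]
        return store
```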
 

Karen A. Frenkel writes about technology and innovation and lives in New York City.