Extract from article by Dr. Rick Brattin
So just how ‘big’ is the world’s ‘big data’, and why should companies care? As discussed in my last column, “What makes today’s data ‘Big’?”, we have experienced an explosion of new data and data sources over the last two decades. This is data that companies are analyzing in pursuit of business insight and competitive advantage. Those who do this well are learning that the sheer size and complexity of today’s data make it a struggle to manage with traditional hardware and software.
We use the term ‘Big Data’ to indicate that today’s data is much larger and more complex than ever before. Doug Laney, of the IT research firm Gartner, Inc., is credited with creating what is perhaps the most widely used framework for describing how ‘big’ our data has become. The framework uses three dimensions: volume (a measure of scale), velocity (a measure of speed), and variety (a measure of form). In this column, I explore the volume dimension and the challenges companies face as a result.
International Data Corporation (IDC), a leading provider of market intelligence for information technology markets, predicts the world’s data will grow to a staggering 44 zettabytes by the year 2020. Huh, 44 what? Herein lies one of the difficulties of big data: data volumes have grown so large that the words we use to describe them are not part of our everyday vocabulary, and that leads to confusion.
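To make that number concrete, here is a minimal sketch in Python. The unit ladder and the decimal (SI, 1000-based) steps are standard; the helper function itself is purely illustrative, not from any particular library. It converts IDC’s 44 zettabytes into the more familiar gigabyte:

```python
# Illustrative only: walk the standard SI unit ladder, where each
# step up multiplies by 1000 (decimal units, not 1024-based binary ones).
UNITS = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte"]

def to_bytes(value, unit):
    """Convert a value in the given SI unit to bytes."""
    return value * 1000 ** UNITS.index(unit)

forty_four_zb = to_bytes(44, "zettabyte")            # 44 x 10^21 bytes
in_gigabytes = forty_four_zb / to_bytes(1, "gigabyte")

print(f"44 ZB = {in_gigabytes / 1e12:.0f} trillion GB")
# prints "44 ZB = 44 trillion GB"
```

In other words, 44 zettabytes works out to roughly 44 trillion gigabytes, which is why the vocabulary, and not just the storage, becomes a challenge.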