You are viewing ARCHIVED CONTENT released online between 1 April 2010 and 24 August 2018 or content that has been selectively archived and is no longer active. Content in this archive is NOT UPDATED, and links may not function.
So just how ‘big’ is the world’s ‘big data’ and why should companies care? As discussed in my last column, “What makes today’s data ‘Big’?”, we have experienced an explosion of new data and data sources over the last two decades. Companies are analyzing this data in pursuit of business insight and competitive advantage. Those who do this well are learning that the sheer size and complexity of today’s data make it a struggle to manage with traditional hardware and software.
We use the term ‘Big Data’ to indicate that today’s data is much larger and more complex than ever before. Doug Laney, of the IT research firm Gartner, Inc., is credited with creating what is perhaps the most widely used framework for describing how ‘big’ our data has become. The framework uses three dimensions: volume (a measure of scale), velocity (a measure of speed), and variety (a measure of form). In this column, I explore the volume dimension and the challenges companies face as a result.
International Data Corporation (IDC), a leading provider of market intelligence for information technology markets, predicts the world’s data will grow to a staggering 44 zettabytes by the year 2020. Huh, 44 what? Herein lies one of the difficulties concerning big data. Data has become so large that the words we use to describe its size are not part of our everyday vocabulary. This leads to confusion.
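To give a sense of the scale behind these unfamiliar words, here is a quick sketch in Python of the decimal (SI) byte-unit ladder, where each prefix is 1,000 times the one before it. The unit list and helper function are illustrative, not part of the original column:

```python
# Decimal (SI) byte units, each 1,000x the previous one:
# kilobyte, megabyte, gigabyte, terabyte, petabyte, exabyte, zettabyte.
PREFIXES = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB"]

def to_bytes(value, unit):
    """Convert a quantity in the given unit to a count of bytes."""
    return value * 1000 ** PREFIXES.index(unit)

# IDC's forecast for the year 2020: 44 zettabytes.
forecast = to_bytes(44, "ZB")
print(f"44 ZB = {forecast:.1e} bytes")  # 44 followed by 21 zeros
```

In other words, a zettabyte is a billion terabytes, which helps explain why the vocabulary of big data so rarely comes up in everyday conversation.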
ComplexDiscovery OÜ is an independent digital publication and research organization based in Tallinn, Estonia. ComplexDiscovery covers cybersecurity, data privacy, regulatory compliance, and eDiscovery, with reporting that connects legal and business technology developments—including high-growth startup trends—to international business, policy, and global security dynamics. Focusing on technology and risk issues shaped by cross-border regulation and geopolitical complexity, ComplexDiscovery delivers editorial coverage, original analysis, and curated briefings for a global audience of legal, compliance, security, and technology professionals. Learn more at ComplexDiscovery.com.