© 2008-23 SmartData Collective. All Rights Reserved.
Categories: Analytics, Collaborative Data, Commentary, Culture/Leadership

It’s Time for a New Definition of Big Data

By MIKE20
Last updated: March 19, 2012, 7:05 am
6 Min Read


Contents
  • Big data that is very small
  • Large datasets that aren’t big
  • Defining big data

Two words seemingly on every technologist’s lips are “big data”.  The Wikipedia definition for big data is: “In information technology, big data consists of datasets that grow so large that they become awkward to work with using on-hand database management tools”.  This approach to describing the term constrains the discussion of big data to scale and fails to realise the key difference between regular data and big data.  The blog posts and books that cover the topic seem to converge on the same approach to defining big data and describe the challenges of extracting value from this resource in terms of its size.

Big data can really be very small and not all large datasets are big!  It’s time to find a new definition for big data.

Big data that is very small

Modern machines such as cars, trains, power stations and planes all have increasing numbers of sensors constantly collecting masses of data.  It is common to talk of having thousands or even hundreds of thousands of sensors all collecting information about the performance and activities of a machine.


Imagine a plane on a regular one hour flight with a hundred thousand sensors covering everything from the speed of air over every part of the airframe through to the amount of carbon dioxide in each section of the cabin.  Each sensor is effectively an independent device with its own physical characteristics.  The real interest is usually in combinations of sensor readings (such as carbon dioxide combined with cabin temperature and the speed of air combined with air pressure).  With so many sensors the combinations are incredibly complex and vary with the error tolerance and characteristics of individual devices.
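The combinatorial explosion described above is easy to quantify: even restricting attention to pairs of sensors, a hundred thousand independent sources yield roughly five billion possible combinations, and three-way combinations number in the hundreds of trillions. A quick sketch (plain Python, using the article’s illustrative sensor count):

```python
import math

SENSORS = 100_000  # independent sensors on the aircraft

# Distinct sensor *pairs* whose readings could interact
pairs = math.comb(SENSORS, 2)
print(f"{pairs:,} possible pairwise combinations")  # 4,999,950,000 possible pairwise combinations

# Three-way combinations grow far faster still
triples = math.comb(SENSORS, 3)
print(f"{triples:,} possible three-way combinations")
```

No realistic amount of schema design can enumerate those combinations up front, which is why the complexity, not the byte count, is the interesting property.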

The data streaming from a hundred thousand sensors on an aircraft is big data.  However, the size of the dataset is not as large as might be expected.  Even a hundred thousand sensors, each producing an eight-byte reading every second, would produce less than 3GB of data in an hour of flying (100,000 sensors x 60 minutes x 60 seconds x 8 bytes).  This amount of data would fit comfortably on a modest memory stick!
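The back-of-the-envelope arithmetic above can be checked directly (a sketch; the sensor count, sample rate and reading size are the article’s illustrative figures, not measurements from any real aircraft):

```python
SENSORS = 100_000       # sensors on the aircraft
SECONDS = 60 * 60       # one hour of flying
READING_BYTES = 8       # one eight-byte reading per sensor per second

total_bytes = SENSORS * SECONDS * READING_BYTES
total_gb = total_bytes / 1e9  # decimal gigabytes

print(f"{total_bytes:,} bytes ≈ {total_gb:.2f} GB")  # 2,880,000,000 bytes ≈ 2.88 GB
```

2.88 GB for a full hour of flight data, confirming the “less than 3GB” figure in the text.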

Large datasets that aren’t big

We are increasingly seeing systems that generate very large quantities of very simple data.  For instance, media streaming is generating very large volumes with increasing amounts of structured metadata.  Similarly, telecommunications companies have to track vast volumes of calls and internet connections.

Even if these two activities are combined, and petabytes of data are produced, the content is extremely structured.  As search engines, such as Google, and relational databases have shown, datasets can be parsed extremely quickly if the content is well structured.  Even though this data is large, it isn’t “big” in the same way as the data coming from the machine sensors in the earlier example.

Defining big data

If size isn’t what matters, then what makes big data big?  The answer lies in the number of independent data sources, each with the potential to interact.  Big data doesn’t lend itself well to being tamed by standard data management techniques simply because of its inconsistent and unpredictable combinations.

Another attribute of big data is its tendency to be hard to delete, making privacy a common concern.  Imagine trying to purge all of the data associated with an individual car driver from toll road data.  The sensors counting the number of cars would no longer balance with the individual billing records which, in turn, wouldn’t match payments received by the company.

Perhaps a good definition of big data is to describe “big” in terms of the number of permutations of sources, which makes useful querying difficult (as with the sensors in an aircraft), and the complexity of interrelationships, which makes purging difficult (as in the toll road example).

“Big”, then, refers to big complexity rather than big volume.  Of course, valuable and complex datasets of this sort naturally tend to grow rapidly, and so big data quickly becomes truly massive.

Tagged: big data