The most widely accepted definition of big data today is the “3Vs”, where the V’s stand for Volume, Velocity and Variety. The definition was first proposed by Gartner during the transition from Web 1.0 to Web 2.0. It is a generic definition in which the behavior of the data addresses the questions of “how much, how fast and how varied” within a specified time frame.
Volume: How much data is generated?
Velocity: How fast is the data generated, and how often does it change?
Variety: How diverse is the total data set that is generated?
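To make the three V’s a little more concrete, here is a minimal, purely illustrative Python sketch that measures each V over a small batch of events. The event stream, the source types, and the `three_v_summary` function are all made-up examples, not part of any real big data tooling; the point is only to show that Volume, Velocity and Variety are three independent measurements of the same data.

```python
from datetime import datetime

# Hypothetical event stream: (timestamp, source_type, payload).
# The records below are invented purely for illustration.
events = [
    (datetime(2024, 1, 1, 12, 0, 0), "tweet",  "loving the new phone"),
    (datetime(2024, 1, 1, 12, 0, 1), "sensor", "temp=21.5"),
    (datetime(2024, 1, 1, 12, 0, 1), "review", "battery life is poor"),
    (datetime(2024, 1, 1, 12, 0, 3), "tweet",  "great camera"),
]

def three_v_summary(events):
    """Summarize a batch of events along the three V's."""
    # Volume: how much data (here, total payload characters).
    volume = sum(len(payload) for _, _, payload in events)
    # Velocity: how fast it arrives (records per second over the batch).
    span = (events[-1][0] - events[0][0]).total_seconds() or 1.0
    velocity = len(events) / span
    # Variety: how diverse the sources are (distinct source types).
    variety = len({source for _, source, _ in events})
    return volume, velocity, variety

volume, velocity, variety = three_v_summary(events)
print(f"Volume:   {volume} characters of payload")
print(f"Velocity: {velocity:.2f} records/second")
print(f"Variety:  {variety} distinct source types")
```

A data set can score high on one V and low on the others, which is exactly why pre-2000 large data sets, strong on volume alone, did not qualify as “Big Data” in the 3V sense.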
Whenever the term “Big Data” is encountered, an immediate connection is made with the internet and social media. Twitter and Facebook are easy examples and are the poster children that showcase all three factors that characterize “Big Data”.
Large data sets came into existence even before persistent data storage became a reality. Remember punch cards used to collect census information? One common denominator before the turn of this century is that large data sets did not conform to the “3V” definition, because they predominantly exhibited only one or two of the three characteristics. Computing architectures, algorithms and other advances in computing factored in only what was required to solve the problem at hand. The result was specialized computers, developed for niche applications, that were good at processing large data sets and crunching numbers. Supercomputers may be extremely fast, but they were designed for specialized applications, which is precisely why you find them in research labs, at NASA, or in the military.
Big Data in today’s context is distributed in nature. With the internet as the backbone, data can be generated by either “human actions” or “programmed devices” and sent anywhere across the globe. Human-driven events are random, with a high degree of unpredictability, whereas devices are predictable in how they record and share data.
In reality, this is a problem to be solved, and there are many approaches that can be adopted. Today, Big Data represents a set of tools, technologies and methodologies that anyone can adopt to collect, process and analyze data that is highly dynamic in nature, i.e. data that exhibits the 3V characteristics.
What does “Big Data” have to do with the Business Intelligence arena?
Businesses invest in a technology or platform only when they can realize the economic value associated with the investment. In this arena, investments are predominantly made in the human side of the equation: research and analytics.
The first and foremost application of Big Data is analyzing the various touch points a business organization has with its customers. Examples include brand mentions, customer feedback and reviews, and trends in behavior. Analysis in this area helps fine-tune sales, marketing, and product design and delivery. The major advantage of adopting big data here is the extremely short feedback cycle; in fact, it is nearly instant.
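As a deliberately simple sketch of customer-feedback analysis, the Python snippet below tallies positive and negative keywords across a handful of reviews. The reviews, the keyword lists and the `feedback_score` function are invented for illustration only; a production pipeline would use streaming infrastructure and proper sentiment models, but the shape of the computation is the same: raw touch-point data in, an actionable signal out, with essentially no delay.

```python
import re
from collections import Counter

# Made-up customer reviews, for illustration only.
reviews = [
    "Battery life is great, camera is great too",
    "Terrible battery, great screen",
    "Shipping was slow, product is great",
]

# Toy keyword lists; real sentiment analysis is far more sophisticated.
POSITIVE = {"great", "good", "excellent"}
NEGATIVE = {"terrible", "poor", "slow"}

def feedback_score(reviews):
    """Return (positive, negative) keyword counts across all reviews."""
    words = Counter(
        w for r in reviews for w in re.findall(r"[a-z]+", r.lower())
    )
    pos = sum(words[w] for w in POSITIVE)
    neg = sum(words[w] for w in NEGATIVE)
    return pos, neg

pos, neg = feedback_score(reviews)
print(f"positive mentions: {pos}, negative mentions: {neg}")
```

Because a tally like this can be recomputed as each new review arrives, the feedback loop back to sales, marketing and product teams is effectively instant.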
Supply chain is another area where big data is making inroads, driven primarily by the customer insights described above.
There is still considerable interest in adopting big data methodologies for mainstream Business Intelligence needs, but this is an ongoing effort. The Internet of Things is now mooted as the next leap in Big Data, and automated (or pervasive) analytics looks close to becoming reality.