Bizarre Data versus Big Data - A Normative Shift

Aug 30, 2013

We can now store and use information of enormous size in our daily lives, at home or at work, from text documents to multimedia. Dealing with data at this scale requires a separate set of technologies, since traditional database systems fail to serve it adequately. Such data are now called 'big data', and different attributes are employed to define them technically. But is it really necessary to redefine the content and meaning of 'data'? Does it still not rally around its normative aspect?

A report says the number of Amazon S3 objects has grown from 200,000 to 1 trillion in a mere six years. Is this count big enough for the data to be called 'big', or is it the size of data those 1 trillion objects carry (note that it's all in the short scale) that earns the name? That is 'volume': we now think in septillions, a 1 followed by twenty-four zeros.

When the US government set about aggregating the personal data of the world's citizens, George Orwell's fictional world seemed to become reality, even if more than a couple of decades later than he foretold. But then, such data grow really big. It is a different matter that a British arrangement came first, in 1857, when the reading tables were placed along the radii of the circular British Museum Reading Room; the Americans conceived of a new world of personal freedom of thought just four decades later by arranging those desks in concentric circles in the Library of Congress. History has now come full circle; at a big scale!

Well, back to our discussion about the 'bigness' of data. Techies will teach you the different attributes data must have to be conclusively called big. But is that relevant? Well, then include 'relevancy' as one attribute of these data. You may not believe the government anymore, but they believe in sampling from your data: call it 'fidelity'.

The list of attributes for 'big' data goes on. Well, we tend to make it truly big. We find the speed of access to data to be one of the criteria: 'velocity'. It is about how quickly we can store and retrieve data; in the same way, how quickly our telephone calls are scanned and then retrieved for analysis - a perpetual search for a sign of vulnerability. Well, you cannot add 'perpetuity' and 'vulnerability' as two more attributes. Obey the geeks: it is about 'variety'; the data can be telephone calls or emails, or any other means of communication that can travel through fibers at the speed of light.

Well, geeks can turn fanciful, like the 20th-century philosophers who added the attribute of 'pleasure' to the long list surrounding the concept of beauty, which until then had been pristine and pure - a normative shift on a serious count. I am not sure whether 21st-century thinkers would think big is beautiful! But then, the redefinition has been made. The meaning and content of 'data' have undergone a significant transformation.

While we may agree with the changed perspective, we still want it applied to its rightful use. Why should we be bothered about these big things if a big crunch of the data is not possible? Can the government afford it if the next terrorist attack happens in spite of all these pervasive data being archived, yet never pointing to the catastrophe? Certainly, we cannot take refuge in the normativism of data if it does not turn into useful information. We cannot simply be happy to find a running graph on the screen of our handheld device and claim that we can visualize big data anywhere and anytime.

For a better understanding of cloud computing and how to derive maximum benefit from the technology, you can refer to the book Cloud Computing by Ashwini Rath.
