The Biggest Big Data

At the risk of sounding like a technical dinosaur, I still have some 3.5” floppy disks neatly filed away, their capacity a tiny 1.44MB each. I remember well popping them in and out of my Mac and other devices of the time, thinking that I would never need any more storage than that. Of course I was wrong, by a long way. Megabytes became gigabytes, gigabytes have become terabytes, and today, whilst listening to Pete Rose from HP talk at the Malvern Festival Of Innovation about Big Data, I heard for the first time about brontobytes – yep, ‘brontobytes’, like that huge dinosaur! This is a huge, huge number; this is almost the biggest of big data.

Just to bring you up to speed sequentially: we have a gigabyte; add three more zeros to get a terabyte; add three more zeros to get a petabyte; then you go exabyte, zettabyte, yottabyte and then the aforementioned brontobyte. That is 10 to the power 27 bytes – more zeros than you can comfortably write down or even quantify. Just to put this into context, 10 to the power 24, the yottabyte, is the total storage capacity of around 250 trillion DVDs. So a brontobyte is 1,000 yottabytes – in other words, massive!
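If you would like to see the whole ladder in one place, here is a quick Python sketch (my own illustration, nothing official) that prints each unit as a power of ten and re-checks the DVD comparison, assuming roughly 4GB per disc:

```python
# The decimal ladder of byte units – each step adds three more zeros.
UNITS = [
    ("gigabyte", 9), ("terabyte", 12), ("petabyte", 15),
    ("exabyte", 18), ("zettabyte", 21), ("yottabyte", 24),
    ("brontobyte", 27), ("geopbyte", 30),  # geopbyte: more on this below
]

DVD_BYTES = 4 * 10**9  # assumption: roughly 4GB per single-layer DVD

for name, power in UNITS:
    print(f"1 {name} = 10^{power} bytes")

# Sanity-check the figure quoted above: how many DVDs make a yottabyte?
print(f"1 yottabyte = {10**24 // DVD_BYTES:,} DVDs")  # 250,000,000,000,000 – 250 trillion
```

Run it and the last line confirms the 250 trillion DVD figure.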

These are immense numbers, and in order to access (never mind find!) data stored in these quantities, new computing methods will be required. HP Labs is already innovating, developing and researching photonics to replace the copper connections within today’s computers, servers and tablets with optical connections – the much-heralded computing, quite literally, at the speed of light. To this end they have a research project called ‘The Machine’, which aims to make computing more efficient by removing the 80% of time that computers spend managing their environment – that is, moving data from one place to another – and letting them get on with the task at hand. The technology and the new thinking (the innovation) needed to do this are immense, but with HP behind it, and their enviable track record in innovation, it will no doubt come to market.
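To see why new methods are needed, it is worth doing the back-of-an-envelope sums. The sketch below (again my own illustration, with rough, assumed data rates) works out how long it would take just to read a brontobyte once at various speeds – even a blisteringly fast optical link moving a terabyte per second would need tens of millions of years:

```python
# How long to read a brontobyte (10^27 bytes) once, end to end?
# The sustained rates below are rough, assumed figures for illustration only.
BRONTOBYTE = 10**27
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

rates = {
    "fast SSD (~5GB/s)": 5 * 10**9,
    "server memory (~50GB/s)": 50 * 10**9,
    "future optical link (~1TB/s)": 10**12,
}

for name, bytes_per_second in rates.items():
    years = BRONTOBYTE / bytes_per_second / SECONDS_PER_YEAR
    print(f"{name}: about {years:,.0f} years to scan a brontobyte")
```

Which is exactly why the answer is not simply faster pipes, but rethinking how much data has to move at all.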

So will ‘The Machine’ be able to crunch through brontobytes at the speed of light? That’s the aim, that’s the dream, in fact that’s the future market need.

Oh, and just in case you were wondering, the brontobyte will eventually go the way of its dinosaur namesake, as the ‘geopbyte’ – that’s 10 to the power 30 – is already in HP’s sights. Big Data will continue to get bigger!

****

Read more about ‘The Machine’ here: http://h30507.www3.hp.com/t5/Cloud-Source-Blog/The-Machine-a-view-of-the-future-of-computing/ba-p/164568

Stuart Wilkes, Guest Blogger