Monday, December 10, 2007

Data Storage

Here's a fascinating data storage challenge. According to New Scientist magazine (1), the new Large Hadron Collider (LHC) at CERN in Geneva, Switzerland, is expected to churn out 450 million GB of data over the next 15 years. But what exactly does that mean? Broken down, it equates to an average of roughly 57 GB/min, round the clock (4). And just how big is a gigabyte? One gigabyte could hold the contents of about 10 yards of books on a shelf (2). So imagine being tossed well over 500 yards' worth of books every minute! If you start talking in terabytes (3), that's about 82 TB/day. Can you imagine storing that amount of data? How long would a backup take?
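For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope calculation in Python. The constants are just the figures quoted above; nothing here comes from CERN itself.

# Rough check of the LHC data-rate figures quoted above.
TOTAL_GB = 450000000          # 450 million GB over the life of the experiment
YEARS = 15

hours = YEARS * 365.25 * 24           # roughly 131,500 hours in 15 years
gb_per_hour = TOTAL_GB / hours        # about 3,423 GB/hour
gb_per_minute = gb_per_hour / 60      # about 57 GB/minute
tb_per_day = gb_per_hour * 24 / 1000  # about 82 TB/day

print("%.0f GB/hour, %.0f GB/min, %.0f TB/day" % (gb_per_hour, gb_per_minute, tb_per_day))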

It does raise a good question, though: are we storing our data efficiently? Storing data digitally is more cost effective than keeping physical documents, but how much of it ends up duplicated across the network? It pays to have a way of organizing data so that this doesn't happen. For small amounts of data it is hardly a problem, but as the volume grows, well... just look at what smashing atoms gets you.
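To make that concrete, here is a minimal sketch, purely illustrative and not anything CERN actually does, of spotting duplicate files by hashing their contents rather than trusting their names. The /data/archive path is just a placeholder.

import hashlib
import os

def find_duplicates(root):
    """Group files under root by the SHA-256 hash of their contents."""
    seen = {}          # content hash -> first path seen with that content
    duplicates = []    # (duplicate path, original path) pairs
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, 'rb') as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                duplicates.append((path, seen[digest]))
            else:
                seen[digest] = path
    return duplicates

# Report every file whose contents already exist somewhere else in the tree.
for copy, original in find_duplicates('/data/archive'):
    print(copy + ' duplicates ' + original)

Real storage systems do this at the block level and stream the data through the hash instead of reading whole files into memory, but the idea is the same.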


1 Paul Marks, New Scientist, 8 December 2007
2 http://www.whatsabyte.com/
3 1 TB = 1,000 gigabytes
4 (450,000,000 / 15) / 8,765 hours per year ≈ 3,423 GB/hour, i.e. about 57 GB/min or 82 TB/day
