
Amazon S3 Holds 566 Billion Objects

Processing 370,000 requests per second at peak times

As of the end of the third quarter of 2011, Amazon Simple Storage Service (Amazon S3) holds more than 566 billion objects and processes more than 370,000 requests per second for them at peak times.

To give some perspective on that number, that is about 82 objects for every person on the planet (according to World Bank population estimates). This is up from 449 billion objects in Q2 and nearly double the count at the end of Q4 last year, when the figure was 262 billion.
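As a quick back-of-the-envelope check of that per-person figure, here is a small sketch in Python; the world population value is an assumption (roughly the World Bank estimate for the period), not a number given in the article.

# Back-of-the-envelope check of the "82 objects per person" figure.
# The ~6.9 billion population is an assumption (approximate World Bank
# estimate for 2010/2011), not a number taken from the article.
objects_stored = 566e9            # objects held in Amazon S3, Q3 2011
world_population = 6.9e9          # assumed population estimate

objects_per_person = objects_stored / world_population
print(f"{objects_per_person:.0f} objects per person")   # prints 82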

Chart: The Cloud Scales: Amazon S3 Growth (total number of objects stored in Amazon S3)
 
These objects are stored by the hundreds of thousands of customers Amazon has in over 190 countries around the world.

One large European organisation using Amazon S3 for its data sets is the European Space Agency (ESA), a research organisation studying the Earth, space and the solar system. One important ESA programme, Data User Elements (DUE), collects data from satellites and relies on Amazon S3 to store and retrieve its images and other end-user products. By using Amazon S3, ESA is able to provide this data to scientists, governmental agencies, and private organisations around the world. The data is used for a range of purposes, including monitoring the environment, improving the accuracy of weather reporting, and assisting disaster relief agencies.
 
In the UK, the property website start-up Zoopla! also uses Amazon S3. It offers information and tools to help users make better-informed property buying decisions. Its website calculates property value estimates by analysing millions of data points relating to property sales and home characteristics throughout the UK. This works by comparing relationships between home prices, economic trends and property characteristics in different geographic areas. Estimates are constantly refined, using the most recent data available and a variety of statistical methodologies, in order to provide the most current information on any home. To deal with such a large amount of data, Zoopla! uses Amazon S3 heavily. Currently, all data is stored in Amazon S3, and every database table across all databases is encrypted and uploaded to Amazon S3 multiple times per day. Amazon S3 also stores all user- and vendor-submitted images and data files for Zoopla!
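To illustrate the kind of backup workflow described, and not Zoopla!'s actual tooling, here is a minimal sketch that dumps a database table, encrypts it, and uploads the result to Amazon S3. The database, table, bucket and recipient names, as well as the use of pg_dump, gpg and the boto3 client, are all assumptions made for the example.

# Illustrative sketch only; the article does not describe Zoopla!'s
# actual tooling. Assumes PostgreSQL (pg_dump), GPG for encryption and
# the boto3 S3 client; every name below is hypothetical.
import datetime
import subprocess
import boto3

BUCKET = "example-backup-bucket"                      # hypothetical bucket
TABLE = "property_sales"                              # hypothetical table
stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%S")
dump_file = f"/tmp/{TABLE}-{stamp}.sql"
enc_file = dump_file + ".gpg"

# Dump a single table from a (hypothetical) database to a local file.
subprocess.run(["pg_dump", "-t", TABLE, "-f", dump_file, "propertydb"], check=True)

# Encrypt the dump before it leaves the machine.
subprocess.run(["gpg", "--batch", "--yes", "--recipient", "backups@example.com",
                "--output", enc_file, "--encrypt", dump_file], check=True)

# Upload the encrypted dump; running this on a schedule gives the
# "multiple times per day" pattern described above.
boto3.client("s3").upload_file(enc_file, BUCKET, f"db-backups/{TABLE}/{stamp}.sql.gpg")

Running such a script from a scheduler, once per table and several times a day, would reproduce the pattern described, with Amazon S3 acting purely as the backup target.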
 
Another European user of Amazon S3 is Swisstopo, the Swiss Federal Office of Topography. It is responsible for Switzerland's geographical reference and mapping data and uses Amazon S3 to run 40 geographic information system applications and web map services. As Switzerland is such a mountainous, rugged country, accurate maps and geographical information are important for everyone from recreational skiers through to the emergency services. From Amazon S3, Swisstopo serves up to 30,000 unique visitors per day. This equates to a data transfer of 8 TB per month and up to 1,300 map tiles delivered per second, from a repository of 250,000,000 pre-produced map tiles, all stored in Amazon S3.
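For illustration only, here is a minimal sketch of handing out pre-produced map tiles straight from S3 via pre-signed URLs, using the boto3 client; the bucket name and the z/x/y key layout are assumptions, as the article does not describe Swisstopo's actual scheme.

# Illustrative sketch only; bucket name and key layout are assumptions.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-map-tiles"                  # hypothetical bucket

def tile_url(z, x, y, expires=3600):
    """Return a time-limited URL for one pre-produced tile stored in S3."""
    key = f"tiles/{z}/{x}/{y}.png"            # assumed z/x/y key layout
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=expires,
    )

print(tile_url(12, 2138, 1447))               # arbitrary example coordinates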

Comments

Read also, on Storage News, the blog of Maarten Vink, an article dated October 17, 2009 but still really interesting:
Amazon provisioning 8 petabytes of storage per day

In this article, Anil Gupta estimates the average size of an object stored in S3 at about one megabyte. Each object can be up to 5TB in size and is accompanied by up to 2KB of metadata.

The total capacity for 566 billion objects is then around 566 billion megabytes, or 566 exabytes of raw capacity, and probably much more once data protection is taken into account, perhaps partly offset by some form of compression. Data are redundantly stored in multiple physical locations.

To store these 566 exabytes, you need 566 million 1TB HDDs, the equivalent of more than three quarters of the current global production of all HDD manufacturers.

Amazon S3 has storage sites in Ireland, Northern California, Northern Virginia, Oregon, Singapore and Tokyo.


Correction (October 14, 2011)
We have received this email from a faithful reader:

Jean-Jacques,

I liked your interesting article on Amazon S3. I just wanted to point out there's a "mega-error" in your calculation of disk capacity required. 566 billion megabytes is equivalent to 566 petabytes, not 566 exabytes. So the disk capacity required is only 566,000 1TB disk drives, rather than the 566 million drives you cite. Still a lot of drives but thankfully not three quarters of the world's production!

Best regards,


Mark O'Malley, Strategic Marketing Manager, Quantum


Editor's note: Since high school, I have had difficulties with numbers made of multiple groups of three zeros. And even after my master's degrees in Economics and Computer Science, both from the University of Paris, it never stopped. It's probably a psychological problem. Coming from my childhood? I don't know. I'll have to consult a psychologist. Sorry for this horrible and unacceptable mistake.
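For readers who want to check the corrected figures themselves, here is a short worked example of the unit conversion, assuming the roughly 1 MB average object size cited above:

# Worked check of the corrected capacity estimate.
# Assumes the ~1 MB average object size estimated by Anil Gupta (above).
objects = 566e9                   # objects stored in Amazon S3, Q3 2011
avg_object_mb = 1                 # assumed average object size, in MB

total_mb = objects * avg_object_mb
total_pb = total_mb / 1e9         # 1 PB = 1 billion MB (decimal units)
drives_1tb = total_mb / 1e6       # one 1TB drive = 1 million MB

print(f"{total_pb:,.0f} PB of raw capacity")     # 566 PB, not 566 EB
print(f"{drives_1tb:,.0f} 1TB disk drives")      # 566,000 drives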
