Standard-IA holds data that's not accessed often, but when it's needed, it's needed fast, unlike the chilled-out pace of Glacier data retrieval.

Charles Babcock, Editor at Large, Cloud

September 21, 2015

3 Min Read
(Image: pheonix3d/iStockphoto)


Amazon Web Services has announced Standard-IA, a new, lower-priced storage service for rarely accessed data.

The new offering is similar to AWS's Glacier, the cold-storage tier of Amazon Web Services' Simple Storage Service (S3). Glacier is used for infrequently accessed data and objects. Its pricing dips as low as 0.7 cents per GB per month, compared with a more typical 3 cents per GB per month for the first TB of S3 Standard storage. (S3 Standard storage prices decline gradually as the quantity of data rises above a TB.)

Standard-IA is designed for storing log files, backups, and other rarely accessed data, or even data that may never be accessed but still must be archived. Unlike some forms of archival storage, where retrieval speed matters little, this data may need to come back quickly if it's retrieved for recovery purposes.

Hence, Standard-IA (which stands for Standard, Infrequent Access) is priced at 1.25 cents per GB per month -- below Standard storage but above Glacier.
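
As a rough illustration of how those rates compare, here is a quick back-of-the-envelope calculation in Python using the per-GB figures quoted above (storage only; request, retrieval, and transfer fees are ignored):

```python
# Rough monthly cost comparison for 1 TB (1,024 GB) at the quoted per-GB rates.
# Illustrative only; actual AWS bills add request, retrieval, and transfer fees.
RATES_PER_GB = {
    "S3 Standard":    0.0300,   # 3 cents per GB per month (first TB tier)
    "S3 Standard-IA": 0.0125,   # 1.25 cents per GB per month
    "Glacier":        0.0070,   # 0.7 cents per GB per month
}

gigabytes = 1024  # 1 TB of stored data

for tier, rate in RATES_PER_GB.items():
    print(f"{tier:>14}: ${gigabytes * rate:,.2f}/month")
# S3 Standard:    $30.72/month
# S3 Standard-IA: $12.80/month
# Glacier:        $7.17/month
```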

[Want to learn more about Amazon's new IT orientation? See Amazon IT Services: No Longer An Oxymoron.]

In a Sept. 16 blog post, Amazon evangelist Jeff Barr said the S3 team studied customer patterns and "found that many AWS customers store backups or log files that are almost never read. Others upload shared documents or raw data for immediate analysis. These files generally see frequent activity right after upload, with a significant drop-off as they age. In most cases, this data is still very important, so durability is a requirement. Although this storage model is characterized by infrequent access, customers still need quick access to their files, so retrieval performance remains as critical as ever."

If data needs to be retrieved, a 1 cent per GB retrieval charge is added to the bill, along with the usual data-transfer charges. Another billing peculiarity: objects smaller than 128 KB are billed for storage as if they were 128 KB.
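
A minimal sketch of how those two billing quirks add up, assuming the rates quoted in this article (the function names are hypothetical, purely for illustration):

```python
# Illustrative Standard-IA billing arithmetic based on the figures above.
STORAGE_RATE = 0.0125      # $/GB/month for Standard-IA storage
RETRIEVAL_RATE = 0.01      # $/GB retrieved
MIN_BILLABLE_KB = 128      # objects smaller than this are billed as 128 KB

def monthly_storage_cost(object_size_kb: float) -> float:
    """Storage cost for one object, applying the 128 KB billing floor."""
    billable_kb = max(object_size_kb, MIN_BILLABLE_KB)
    return (billable_kb / (1024 * 1024)) * STORAGE_RATE  # KB -> GB

def retrieval_cost(gigabytes_retrieved: float) -> float:
    """Per-retrieval charge, on top of normal data-transfer fees."""
    return gigabytes_retrieved * RETRIEVAL_RATE

# A 4 KB log file is billed the same as a 128 KB one:
assert monthly_storage_cost(4) == monthly_storage_cost(128)
# Pulling back a 500 GB backup adds a $5.00 retrieval charge:
print(f"${retrieval_cost(500):.2f}")
```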

All three forms of S3 storage -- Standard, Standard-IA, and Glacier -- have the same durability rating: 99.999999999% or eleven nines of durability. The new Standard-IA storage has an availability service level agreement of 99%.

"We believe that this pricing model will make this new storage class very economical for long-term storage, backups, and disaster recovery, while still allowing you to quickly retrieve older data if necessary," Barr wrote.

Standard-IA storage inherits existing S3 features, such as security and access management, data lifecycle policies, cross-region replication, and event notifications, Barr said.

Standard-IA is now available in all AWS regions as an S3 option for new objects uploaded through the AWS Management Console.
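
Outside the console, the storage class can also be selected per object through the S3 API. A minimal sketch using boto3, the AWS SDK for Python (the bucket, key, and file names are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket, key, and local file. The StorageClass parameter selects
# Standard-IA for this object instead of the default S3 Standard class.
s3.put_object(
    Bucket="example-backup-bucket",
    Key="logs/2015/09/app.log.gz",
    Body=open("app.log.gz", "rb"),
    StorageClass="STANDARD_IA",
)
```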

Customers may also combine the three storage classes over a timeline. For example, newly uploaded data could sit in Standard for its first 30 days, move to Standard-IA for the next 60 days, and then move to Glacier.
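
That kind of timeline maps onto an S3 lifecycle rule. A minimal sketch, again using boto3 and a hypothetical bucket, that mirrors the 30-day and 90-day transitions in the example above:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and prefix. Objects stay in Standard for 30 days,
# spend the next 60 days in Standard-IA, then transition to Glacier.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-down-old-backups",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```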

Customers that manage their storage through policies can add Standard-IA as a policy to their systems and then invoke it as needed, without changing application code that relies on the policies, Barr said.

He quoted customer Don MacAskill, CEO and "chief geek" of photo storage service SmugMug: "With many petabytes of [customer photos] stored on Amazon S3, it's vital that customers have immediate, instant access to any of them at a moment's notice -- even if they haven't been viewed in years. Amazon S3 Standard-IA offers the same high durability and performance as Amazon S3 Standard."

About the Author(s)

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse University where he obtained a bachelor's degree in journalism. He joined the publication in 2003.
