Find The Right Data Archive Method

Backup-as-archive is an increasingly viable storage solution, especially for companies that don't have strict data retention requirements.

George Crump, President, Storage Switzerland

December 10, 2012

3 Min Read

In almost every study I've conducted or reviewed, one fact remains consistent: at least 75% of the data stored on primary storage hasn't been accessed in more than a year.
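If you want to put that number to the test in your own environment, here's a minimal sketch in Python. The /mnt/primary path is a hypothetical mount, and the tally assumes access times are actually tracked -- many file systems mount with noatime or relatime, which will skew the result:

```python
import os
import time

# Walk a directory tree and tally bytes whose last access time is more
# than a year old -- the "cold" data the studies describe.
ONE_YEAR = 365 * 24 * 60 * 60
ROOT = "/mnt/primary"  # hypothetical primary-storage mount

now = time.time()
cold_bytes = total_bytes = 0
for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            st = os.stat(path)
        except OSError:
            continue  # skip files that vanish or deny access mid-scan
        total_bytes += st.st_size
        if now - st.st_atime > ONE_YEAR:
            cold_bytes += st.st_size

if total_bytes:
    print(f"{cold_bytes / total_bytes:.0%} of scanned bytes untouched for a year")
```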

This data really should be archived. Products that move such data transparently to an archive have improved dramatically in recent years, and it may be time to reconsider data archiving. In this column I'll look at the competing methods for archiving data; in my next column I'll look at some of the competing archive targets.

Backup As Archive

While not the ideal location, backup is the de facto archive at many companies. Archive purists will disagree with me, but I believe backup products can in some cases meet the archive need, especially for companies that don't have to satisfy government regulations or other data retention requirements. Backup may also be the most realistic way to archive, since most companies are already backing up everything anyway. As I discussed in this article, many organizations count on backups for long-term data retention instead of using a separate archive product.

[ How to choose a virtualization-friendly backup app: See Big Backup Challenge Of Medium-Size Data Centers. ]

One reason backup archiving has lately gained legitimacy is that backup software can now create larger meta-data tables (data about the data in the backup) and can better search that data. Some products now even offer content search capabilities. Improvements in backup products' scalability are another reason the backup-as-archive approach is more practical than it has been.
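The core idea is simple even if the scale is not: keep a searchable table of data about the backups, separate from the backup media itself. A toy sketch of such a catalog, using SQLite with invented paths and job names purely for illustration:

```python
import sqlite3

# A meta-data table ("data about the data in the backup") that can be
# searched without reading the backup media. Schema is illustrative.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE catalog (
    path TEXT, size INTEGER, mtime TEXT, backup_job TEXT)""")
con.executemany(
    "INSERT INTO catalog VALUES (?, ?, ?, ?)",
    [("/finance/q3-report.xlsx", 48210, "2011-10-02", "weekly-041"),
     ("/legal/contract-acme.pdf", 91344, "2010-06-17", "weekly-112")],
)

# Search the catalog, not the tapes: find every retained copy of a
# file by name pattern across all backup jobs.
for row in con.execute(
        "SELECT path, backup_job FROM catalog WHERE path LIKE ?",
        ("%contract%",)):
    print(row)
```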

The key limiting factor for disk backup products has not been how many disks they can add to the shelf, but how far their deduplication tables scale. This is another meta-data issue. One approach we've seen vendors take is to segment the deduplication table into multiple tables as the data ages. This lowers deduplication effectiveness but allows for longer retention without lengthy table lookups dragging down current backup performance. Eventually, though, deduplication engines will need to be improved in order to scale, as discussed in this article.
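To illustrate the trade-off, here's a toy sketch of age-based table segmentation. It uses fixed-size chunks and in-memory tables, which real products don't; the point is only the mechanics: ingest consults the current table alone, so lookups stay fast, but duplicates of chunks held only in aged tables get stored again:

```python
import hashlib

CHUNK = 4096           # fixed-size chunks; real engines use variable sizes
store = {}             # chunk data keyed by content hash
current_table = set()  # hashes consulted on every new backup
aged_tables = []       # sealed tables for older data, skipped at ingest

def ingest(data: bytes) -> int:
    """Store chunks not seen in the current table; return bytes written."""
    written = 0
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in current_table:
            # A chunk that lives only in an aged table is written again:
            # lookups stay fast, dedupe effectiveness drops.
            current_table.add(digest)
            store[digest] = chunk
            written += len(chunk)
    return written

def age_out():
    """Seal the current table once it grows too large for fast lookups."""
    global current_table
    aged_tables.append(current_table)
    current_table = set()
```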

One thing the backup-as-archive method typically doesn't deliver is a fix for the problem cited above: the stale data stays on primary storage. Backup-as-archive is best for companies that are less concerned with how much data they're storing on primary storage and primarily need a way to retain information in case they need it later.

Archive As Archive

Because backup is becoming more viable as a long-term retention area, archive solutions are taking a different approach. Just as the tools that move data from primary storage to archive storage are improving, so is the ability to browse the archive independently of a specific archive application. Most archives now simply show up as a network mount. They can also combine disk and tape to balance restore performance against cost.

The key to archive success is to move archiving upstream, where it can play a more active role in managing primary storage. Given that level of transparency and fast recovery times, archiving data after 90 days of inactivity will likely have no impact on productivity -- and maximum impact on cost reduction.
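For a sense of the mechanics, here's a minimal sketch of such a 90-day policy. Both mount points are hypothetical, and it leaves a plain symlink behind where a commercial mover would typically use a stub file or file-system filter driver:

```python
import os
import shutil
import time

# Move files idle for 90 days from primary storage to the archive's
# network mount, leaving a symlink so access stays transparent.
IDLE = 90 * 24 * 60 * 60
PRIMARY = "/mnt/primary"
ARCHIVE = "/mnt/archive"  # the archive presented as a network mount

now = time.time()
for dirpath, _, filenames in os.walk(PRIMARY):
    for name in filenames:
        src = os.path.join(dirpath, name)
        if os.path.islink(src) or now - os.stat(src).st_atime < IDLE:
            continue  # still active, or already archived
        dst = os.path.join(ARCHIVE, os.path.relpath(src, PRIMARY))
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.move(src, dst)  # data leaves primary storage
        os.symlink(dst, src)   # reads follow the link to the archive
```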

There's a lot to be gained by removing 75% or more of your data from primary storage: backups get faster, and investment in higher-speed storage (SSD) for the remaining active data becomes easier to justify. Data integrity also improves, since most archive solutions perform ongoing integrity checks that protect you from silent data corruption (bit rot).
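Those integrity checks usually amount to recording a checksum at archive time and re-verifying it on a schedule. A minimal sketch of the verification pass, with a hypothetical manifest location:

```python
import hashlib
import json
import os

# Re-hash archived files against a manifest of expected SHA-256 digests
# and flag silent corruption. The manifest path is illustrative.
MANIFEST = "/mnt/archive/.manifest.json"

def sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def verify():
    with open(MANIFEST) as f:
        manifest = json.load(f)  # {path: expected_digest}
    for path, expected in manifest.items():
        if not os.path.exists(path):
            print(f"MISSING {path}")
        elif sha256(path) != expected:
            print(f"BIT ROT {path}")  # repair from a replica or tape copy
```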

In my next column I'll look at some of the products that are competing for your archive dollars: disk appliances, object storage systems, cloud storage providers, and, of course, tape.


About the Author

George Crump

President, Storage Switzerland

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. With 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, and SAN. Prior to founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection. George is responsible for the storage blog on InformationWeek's website and is a regular contributor to publications such as Byte and Switch, SearchStorage, eWeek, SearchServerVirtualization, and SearchDataBackup.

