What Is Deduplication And Why Should You Care?


George Crump, President, Storage Switzerland

June 5, 2009

3 Min Read

A couple of days ago I was speaking at an event in Dallas and was reminded that sometimes those of us in storage get too wrapped up in, well, storage, and that IT professionals have other things to worry about than just storage. I asked the audience how many of them had done anything with deduplication. Only 30% had, although 100% wanted to know more. With all the news about NetApp and EMC in a bidding war to buy Data Domain, it might make sense for us to pause a moment and explain why these two companies are willing to pay almost $2 billion for the market-leading provider of this technology.

Deduplication, at its simplest level, examines data and compares it to data that is already stored. If the data is identical, instead of storing a second copy, the deduplication technology establishes a link to the original data. It takes significantly less storage space to establish a link than to store the file again.
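A minimal sketch of the idea is below, using a content hash over fixed-size blocks; the block size, dictionary store, and SHA-256 choice are illustrative assumptions, not how any particular product implements it (real systems typically use variable-length chunking and more careful collision handling):

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size, purely for illustration

store = {}       # digest -> block contents, stored only once
file_index = []  # per-write list of digests, i.e. the "links" to stored data

def write_block(block: bytes) -> None:
    digest = hashlib.sha256(block).hexdigest()
    if digest not in store:       # net-new data: store it
        store[digest] = block
    file_index.append(digest)     # duplicate data: only a reference is kept

def write_file(data: bytes) -> None:
    for i in range(0, len(data), BLOCK_SIZE):
        write_block(data[i:i + BLOCK_SIZE])

# Writing the same data twice consumes the space of one set of unique blocks.
write_file(b"A" * 8192)
write_file(b"A" * 8192)
print(len(store), "unique blocks stored for", len(file_index), "block references")
```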

Deduplication first gained traction as a technology to enhance disk backup. Without deduplication, your disk backup had to scale to store multiple full backups and several weeks' worth of incremental backups. Even with the plummeting price of ATA storage, the cost to configure a disk array to store even a month's worth of backups, let alone the power, cooling, and space required by the array, was enormous.
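A rough back-of-the-envelope calculation shows why that math got painful; the retention policy, data size, and change rate here are assumptions chosen only to illustrate the scaling:

```python
# Hypothetical retention policy: four weekly fulls plus roughly a month of daily incrementals.
full_backup_tb = 10        # assumed size of one full backup
daily_change_rate = 0.05   # assumed 5% of data changes per day
weekly_fulls = 4
daily_incrementals = 20    # about a month of weekday incrementals

raw_capacity_tb = (weekly_fulls * full_backup_tb
                   + daily_incrementals * full_backup_tb * daily_change_rate)
print(f"Disk needed without deduplication: {raw_capacity_tb:.0f} TB")  # 50 TB for 10 TB of data
```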

If you do backups, you know that this data, especially in full backups, is highly redundant, and this is where deduplication shines. As a result, backup was the first market in which the technology became a requirement, and companies like Data Domain and Avamar became market leaders. Avamar was snatched up by EMC, but Data Domain made it all the way to becoming a public company.

What really drove Data Domain's success in the backup space is the ability to replicate backup data to another site. This was an often-requested feature when disk-to-disk backup first started to become viable, but given the volume and speed at which backup data is created, standard replication wouldn't work across normal WAN bandwidth. Deduplication gets around this because it stores only changed or net-new data, and only that data needs to be replicated, which is much more WAN friendly.
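Conceptually, a deduplicating replica only has to ship the blocks the remote site doesn't already hold. A hedged sketch of that set difference, with hypothetical block digests standing in for real data:

```python
# Which blocks actually need to cross the WAN? Only the ones the target doesn't already have.
source_digests = {"a1", "b2", "c3", "d4"}   # hypothetical block digests at the primary site
target_digests = {"a1", "b2"}               # digests already present at the remote site

to_replicate = source_digests - target_digests
print(f"Blocks to send over the WAN: {sorted(to_replicate)}")  # only the net-new data moves
```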

These capabilities in backup alone are not enough to justify a $2 billion investment in deduplication technology. What is driving these companies to pay this kind of money is what deduplication can do for the rest of the storage spectrum: primary storage and archive storage.

For example, if a company armed with deduplication can implement it in primary storage in a way that causes little to no performance impact yet increases storage efficiency by 60% to 70%, that could get interesting. Imagine if you needed 80TB of storage but one of your vendors only needed to supply you with 40TB because it had this technology; clearly that vendor would have a significant advantage in winning your business.

Clearly this technology is not limited to Data Domain, and there are a host of other vendors that can provide compression, deduplication, or both at different levels of storage. The publicity generated by this bidding war obviously helps Data Domain, but it also helps many of the other deduplication suppliers.

What all of this should tell you is that deduplication is important, that how it is used and implemented in the various storage tiers matters, and that this is as good a time as any to begin learning about and implementing the technology.

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.

About the Author

George Crump

President, Storage Switzerland

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. With 25 years of experience designing storage solutions for datacenters across the US, he has seen the birth of such technologies as RAID, NAS, and SAN. Prior to founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection. George is responsible for the storage blog on InformationWeek's website and is a regular contributor to publications such as Byte and Switch, SearchStorage, eWeek, SearchServerVirtualization, and SearchDataBackup.
