Targeting Primary Storage
Primary storage is the next target of optimization. As mentioned in my <a href="http://www.informationweek.com/blog/main/archives/2008/10/your_data_is_no.html">last entry</a>, the growth rate of active data on this storage is small; what is growing is the older data set, which, for now, still remains on primary storage. Optimizing primary storage is the most significant step you can take in reducing not only your storage budget but also your storage's use of power and cooling.
The first step is to understand what you have. It amazes me how, because of rapid data growth, IT professionals no longer have a grasp on what type of storage they have and how it is being utilized. There are a variety of tools available from EMC, Tek-Tools, and others that can help you see what your storage looks like, and some will even provide trending information.
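Even before bringing in one of those tools, you can get a rough picture on your own. Below is a minimal sketch, assuming a Unix-style file server mounted at a hypothetical path /mnt/primary, that buckets capacity by last-access age. It is no substitute for the reporting products mentioned above, but it is enough to show how much of your primary tier has gone cold.

<pre>
# Minimal sketch: bucket primary storage capacity by last-access age.
# /mnt/primary is a hypothetical mount point -- adjust for your environment.
import os
import time

MOUNT = "/mnt/primary"
DAY = 86400
buckets = {"0-90 days": 0, "90-365 days": 0, "over 1 year": 0}
now = time.time()

for root, _, files in os.walk(MOUNT):
    for name in files:
        path = os.path.join(root, name)
        try:
            st = os.stat(path)
        except OSError:
            continue                      # skip files we cannot read
        age_days = (now - st.st_atime) / DAY
        if age_days < 90:
            buckets["0-90 days"] += st.st_size
        elif age_days < 365:
            buckets["90-365 days"] += st.st_size
        else:
            buckets["over 1 year"] += st.st_size

for label, size in buckets.items():
    print(f"{label}: {size / 1024**3:.1f} GB")
</pre>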
The next step depends on what your plans for primary storage are this coming year. If a new storage system is in your budget, then I suggest exploring one of the systems that can transparently move data at a block level from Fibre Channel to SATA drives. Provided by companies like Compellent and 3PAR, these systems give you a quick entry into better storage management.
The next step is to get this older data off primary storage and onto scalable or power-managed disk archives like those offered by Copan Systems, Nexsan Technologies, or Permabit Technology. Doing this, however, will require either a manual move of the data or some sort of data mover. In previous entries I have written about different types of data movers: global file systems, agent-based movers, and out-of-band data movers. I've never written about manual moves, but I will cover that in an upcoming entry. Moving data manually may be one of the simplest and most cost-effective ways to solve this problem.
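To give a flavor of what a manual move can look like, here is a minimal sketch, assuming the primary share is mounted at /mnt/primary and the archive target (whether from Copan, Nexsan, Permabit, or anything else) is mounted at /mnt/archive; both paths are hypothetical. It moves files not accessed in a year, preserving their directory layout.

<pre>
# Minimal sketch of a manual data move: relocate files untouched for a year
# from a hypothetical /mnt/primary share to a hypothetical /mnt/archive mount.
import os
import shutil
import time

SOURCE = "/mnt/primary"      # assumption: your primary storage mount
ARCHIVE = "/mnt/archive"     # assumption: your archive mount
CUTOFF = time.time() - 365 * 86400

for root, _, files in os.walk(SOURCE):
    for name in files:
        src = os.path.join(root, name)
        try:
            if os.stat(src).st_atime >= CUTOFF:
                continue                       # still active, leave it alone
        except OSError:
            continue
        dest = os.path.join(ARCHIVE, os.path.relpath(src, SOURCE))
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.move(src, dest)                 # move the cold file to archive
        print("archived", src)
</pre>

In practice you would want to test on a small share first and leave a stub or shortcut behind if users still expect to find the files in their original location.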
An interim step between these approaches may be data deduplication. NetApp can do this now, Riverbed Technology has announced that it will do so next year, and Ocarina Networks offers deduplication for specific hard-to-optimize data environments, such as graphic images and other types of media.
Lastly, there is an emerging set of solutions that do inline data compression of active data. Companies like Storwize are able to compress active file data in-band. Your compression will vary depending on data type, but we have found there to be little if any performance impact as a result. Even if you only get 2:1 compression, this is a simple, cost-effective way to possibly cut your storage needs in half. Most important, these solutions are compatible with, and even complementary to, the other solutions.

Primary storage shouldn't be the problem that it is. There are plenty of tools to address it, and getting control of this environment can have an almost instant ROI, plus make other processes, such as backup, simpler.
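One parting, practical note: before evaluating a compression appliance, you can get a rough feel for what your own data might yield. The sketch below, assuming the same hypothetical /mnt/primary mount, zlib-compresses a sample of files and reports an overall ratio. Real products will behave differently, but the relative numbers by data type are informative.

<pre>
# Minimal sketch: estimate compressibility of a sample of primary storage data.
# /mnt/primary is a hypothetical mount point -- adjust for your environment.
import os
import zlib

MOUNT = "/mnt/primary"
SAMPLE_LIMIT = 500           # stop after sampling this many files
raw = compressed = sampled = 0

for root, _, files in os.walk(MOUNT):
    for name in files:
        if sampled >= SAMPLE_LIMIT:
            break
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as f:
                data = f.read(1024 * 1024)     # sample up to 1 MB per file
        except OSError:
            continue
        if not data:
            continue
        raw += len(data)
        compressed += len(zlib.compress(data))
        sampled += 1

if compressed:
    print(f"sampled {sampled} files, estimated ratio {raw / compressed:.1f}:1")
</pre>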
Track us on Twitter: http://twitter.com/storageswiss.
Subscribe to our RSS feed.
George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.