Efficiency: A Key Objective For 2009
In 2009, more than in any previous year, IT professionals are looking for ways to drive out costs. Technologies like deduplication, compression and server virtualization all aim to lower IT expenditures, and they have been successful at doing just that. The challenge, however, is that each of these technologies can compound the difficulty of making IT operations more efficient by putting more workload in the same space.

To deal with this, IT professionals need new systems, tools and processes that allow them to be more efficient, in parallel with the projects that cram more workload into the same amount of space. As I mentioned in our previous entry, archiving is a solution that optimizes space while at the same time reducing storage expenditures. Beyond that, what is needed is a series of tools and processes to help monitor and optimize the environment as it evolves and changes. Tools like those from Tek-Tools, Vizioncore and Virtual Instruments are examples.
The first step is to establish a baseline, ideally before any other IT consolidation occurs. This lets you know what you have and where it is before you optimize it. Many of these tools will also help with movement of the environment. For example, in the virtualization world they will help migrate physical servers to virtual servers, or in storage they will move old data off of primary storage, optimize it and then store it on secondary storage, as Ocarina Networks can.
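To make the baselining step concrete, here is a minimal sketch of what capturing a pre-consolidation baseline might look like. The function name, the metric (percent CPU busy) and the sample values are all hypothetical, chosen only to illustrate the idea of summarizing utilization before any optimization work begins:

```python
import statistics
from datetime import datetime, timezone

def record_baseline(samples):
    """Summarize pre-consolidation utilization samples (percent busy)
    into a baseline that later monitoring can be compared against.
    Hypothetical helper; real monitoring tools collect far more detail."""
    return {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "mean": statistics.mean(samples),
        "stdev": statistics.stdev(samples) if len(samples) > 1 else 0.0,
        "peak": max(samples),
    }

# Hypothetical CPU-utilization samples gathered before consolidation.
cpu_samples = [22.0, 25.5, 19.8, 30.2, 27.1]
baseline = record_baseline(cpu_samples)
print(baseline["peak"])  # 30.2
```

The point of keeping mean, deviation and peak is that the post-consolidation environment can then be judged against known-normal behavior rather than guesswork.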
Most importantly, as these projects commence and mature, the tools provide critical real-time monitoring that gives a single view from the virtual machine out to the storage it connects to. This monitoring is a key component of efficiency; as we begin to optimize workloads and push servers and storage to higher levels of utilization, there is less margin for error. A sudden spike in the environment has to be recognized as early as possible, and the solution should make corrective recommendations to help resolve the issue.
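The spike-then-recommend loop described above can be sketched in a few lines. This is not how any of the named vendors' products work internally; it is a simplified illustration, assuming a baseline mean and standard deviation like those captured earlier, with a hypothetical three-sigma threshold:

```python
def detect_spike(current, baseline_mean, baseline_stdev, threshold=3.0):
    """Flag a utilization reading that deviates from the baseline by
    more than `threshold` standard deviations (hypothetical rule)."""
    if baseline_stdev == 0:
        return current > baseline_mean
    return (current - baseline_mean) / baseline_stdev > threshold

def recommend(current, baseline_mean, baseline_stdev):
    """Turn a spike detection into a human-readable corrective suggestion."""
    if detect_spike(current, baseline_mean, baseline_stdev):
        return "spike detected: consider migrating workloads off the busy host"
    return "within normal range"

# A reading of 95% busy against a baseline of 25% +/- 4% is flagged;
# a reading of 27% is not.
print(recommend(95.0, 25.0, 4.0))
print(recommend(27.0, 25.0, 4.0))
```

The narrower the margin for error in a highly consolidated environment, the more valuable it is that the alert carries an actionable recommendation rather than just a red light.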
Without staff efficiency measures in place, optimizing the environment with virtualization, deduplication and compression can actually compound the problem. Since more work is being done in the same amount of space, there is no visual cue that more work is being done. To the manager walking by, the data center looks the same size or even smaller; meanwhile, the IT administrator is getting buried.
Eventually this gets recognized, and if more staff is hired, much of the cost savings from consolidation is lost. That's why it makes sense to work on staff efficiency first, improving the tool set prior to, or at least in parallel with, consolidation.
Track us on Twitter: http://twitter.com/storageswiss.
Subscribe to our RSS feed.
About the Author
George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.