Major corporations are sitting up and paying attention to IDC. Having completed a major study of the back-up patterns of large corporates, the research giant has hit on a simple way to create massive savings. Can these techniques be useful for KitGuru readers as well?
First, let’s just be clear about something. All things being equal, the more you back up, the more it costs. Let’s all agree that statement is true; it will make everything else so much easier.
KitGuru has been sent an evaluation copy of a tool called System Mechanic from Julia Zamorska and the SM loving people from Iolo in California. We’ll take the product apart next month, but – for now – it has a useful feature that will allow us to evaluate IDC’s claims. That feature is called ‘Find Duplicate Files’ and it does exactly what it says on the tin.
We clicked ‘Go’ anyway and left the program alone for six minutes to do its business.
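We haven’t seen inside Iolo’s code, but the general technique behind a ‘Find Duplicate Files’ feature is straightforward: hash the contents of every file and group files whose hashes match. A minimal sketch (this is our illustration, not System Mechanic’s actual implementation):

```python
import hashlib
import os
from collections import defaultdict

def find_duplicate_files(root):
    """Group files under `root` by content hash; return only groups with duplicates."""
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            try:
                with open(path, "rb") as f:
                    # Read in 1 MB chunks so large files don't exhaust memory
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
            except OSError:
                continue  # skip unreadable files
            by_hash[h.hexdigest()].append(path)
    return {digest: paths for digest, paths in by_hash.items() if len(paths) > 1}
```

A real tool would speed this up by comparing file sizes first and only hashing candidates that match, but the hash-and-group idea is the core of it.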
If we had set a back-up running at that point, it would have taken more space, which could require bigger media. It would also have taken longer, which uses more resources: (a) our time checking it, and (b) the electricity to keep everything running.
As you might expect, the IDC analysis was completed on a much bigger scale. Here are some of the more interesting facts that Richard Villars and Eric Hatcher found when they began this EMC-sponsored research.
- People are using more and more computers to store more and more information [no sh*t Sherlock - Ed]
- Often, bigger files are considered more important. As a result they are backed up more frequently, under a range of slightly different names and over a longer period of time. CAD files were used as an example.
- Overall, major corporations saved $1m each, per annum, by having a significantly more intelligent back-up and recovery solution
- While a significant reduction in back-up time was the most important cost-saving item (de-duplication wonderland starts here), there were also major cost benefits from simpler management, less auditing and faster restores
The conclusion is clear. No matter how big or small your operation, having disk-based data deduplication and replication can have a dramatic impact on the costs and problems associated with backup/recovery.
Having a file in more than one place is fine, but – for maximum efficiency – that should mean a single working copy plus a structured back-up.
If you want to have a look at how much duplication you have yourself, then grab a copy of System Mechanic from here and try the ‘Find Duplicates’ option. We’re not going to say if the rest of the package is good just yet (no matter how much Julia Zamorska recommends SM), but this option is certainly interesting.
KitGuru says: The real kicker here is not the back-up itself, but the restore. If you encounter a problem and need to recover your system, then having 30% fewer files will mean you are back up and running much more quickly. Surely that’s the biggest benefit of all?
Comment below or in the KitGuru forum.