Friday 25 November 2011

Backups, an art that needs reinvention

Some of the articles you see about data lost in the cloud are beyond belief. There is no excuse for losing data that was stored more than 24 hours before the problem happened.  Most storage users will have a few snapshots and a DR-tested way of restoring them.  The problem comes when you go beyond the snapshots that are still on disk.  Backup of snapshots to other media is still in its infancy; the most prominent backup solutions just don't have it in them.  And I have seen virtual server systems presented as complete solutions without a thought for how to get the data back if the thing burned down or, currently more likely, was drowned in a flood.  There is a job here for a specialist in deduplication, with the added flavour of a couple of extra copies.
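As a minimal sketch of what that specialist job might look like, the snippet below copies snapshot files off the primary array into a content-addressed store on a couple of separate media, so identical content is kept once per medium but exists in more than one place. The paths and the flat, file-per-snapshot layout are assumptions for illustration, not any particular vendor's API.

    import hashlib
    import shutil
    from pathlib import Path

    # Assumed layout for illustration: snapshots exported as plain files on the
    # primary array, and two independent backup media mounted locally.
    SNAPSHOT_DIR = Path("/mnt/primary/snapshots")
    BACKUP_ROOTS = [Path("/mnt/backup_a"), Path("/mnt/backup_b")]

    def file_digest(path):
        """Content hash used as the dedup key."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def back_up_snapshots(copies=2):
        """Copy each unique snapshot file, named by its hash, to `copies` media."""
        for snap in SNAPSHOT_DIR.rglob("*"):
            if not snap.is_file():
                continue
            digest = file_digest(snap)
            for root in BACKUP_ROOTS[:copies]:
                target = root / digest[:2] / digest
                if target.exists():
                    continue  # already deduplicated on this medium
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(snap, target)

    if __name__ == "__main__":
        back_up_snapshots(copies=2)

The point is not the ten lines of copying; it is that the dedup store and the extra copies live on media that will survive the fire or the flood that takes the primary array.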

There is a tendency not to treat virtual servers as real servers.  Of course you can restore all the physical servers, but what about the virtual ones?  With dormant or little-used virtual servers, a lot of them can fit on a few physical hosts, yet the total data can still be the same as if each server were a separate physical machine.  If you haven't backed it all up, you need at a minimum a definite restorable master and a record of all the steps taken to create each one.
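A minimal sketch of that bookkeeping, assuming a simple hand-kept inventory rather than any particular hypervisor's API: every virtual server must have either a backup, or a restorable master image plus a recorded list of the steps used to build it.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class VirtualServer:
        """One entry in a hand-kept VM inventory (illustrative, not a hypervisor API)."""
        name: str
        backed_up: bool = False
        master_image: Optional[str] = None        # e.g. a golden template name
        build_steps: List[str] = field(default_factory=list)

    def restore_gaps(inventory):
        """Return the names of VMs we could not confidently rebuild or restore."""
        gaps = []
        for vm in inventory:
            rebuildable = vm.master_image is not None and len(vm.build_steps) > 0
            if not (vm.backed_up or rebuildable):
                gaps.append(vm.name)
        return gaps

    if __name__ == "__main__":
        vms = [
            VirtualServer("build01", backed_up=True),
            VirtualServer("test-db", master_image="debian-base",
                          build_steps=["install postgres", "load schema v3"]),
            VirtualServer("scratch-web"),  # neither backup nor build record: the problem case
        ]
        print("Cannot restore:", restore_gaps(vms))

Anything that comes back from a check like this is a server that would simply be gone after the flood.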
Nor should we forget the data people carry around with them. Laptops get ever more capable, most now more powerful than servers were four years ago, and developers like to have it all at hand.  A very important part of that time-critical project might have its only copy on a thing thrown hither and thither every morning and evening.  This is greatly encouraged by the cheap developer tools licensing we see emerging as a teaser to get more people on board.  And developers never were the first to think about what happens when things go wrong, or whether that online storage deal included a quantifiable and guaranteed backup/restore.

Often the issue is that it takes a long time for a user to discover their data is actually no longer there.  Today even the smallest of users can have thousands of files, and since nobody learns about file systems and folders any more, they never see them except when they need them. It can take months or years if the files are only used at annual budget time or the multi-year planning stage. For that amount of data and iterations it is, or was, often uneconomical to store it all on disk.  Besides, your auditor probably still loves tape.
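One hedge against data quietly going missing is a periodic restore spot check, so the loss is found before the owner next asks for the file. The sketch below assumes a simple catalogue of path-to-checksum pairs recorded at backup time and a directory where a scripted test restore has been unpacked; the names are illustrative only.

    import hashlib
    import random
    from pathlib import Path

    def sha256_of(path):
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def spot_check(catalog, restore_root, sample=20):
        """Restore-test a random sample of catalogued files; return the failures.

        `catalog` maps relative path -> sha256 recorded at backup time;
        `restore_root` is where a scripted test restore has been unpacked.
        """
        restore_root = Path(restore_root)
        failures = []
        for rel_path in random.sample(sorted(catalog), min(sample, len(catalog))):
            restored = restore_root / rel_path
            if not restored.exists() or sha256_of(restored) != catalog[rel_path]:
                failures.append(rel_path)
        return failures

Anything in the returned list is data the business thinks it still has but does not, discovered months before the annual budget round asks for it.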

We also have the fast pace of the technology. A much-used refresh cycle is three to four years, due to the rapid rise in hardware support costs after the initial contracted support period.  But the requirement is that financial data must be stored for seven years.   Ask your IT department if they can restore you a seven-year-old backup.  Even if they have the tapes, do they have the drives to restore them with, or the system to restore them on to?  It is not such a large problem if the software system is still in use and the data is stored in a database; those are easy to migrate with the hardware refresh, as long as you haven't segregated out too much of the old to fit the new.  Still, you can always add some more modern storage to get that data back in, if you planned for that eventuality in the first place.
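A rough way to ask that question of yourself, rather than of the IT department, is to walk the retention window and check that every year has at least one backup set written in a format you still have drives and software for. The catalogue of backup sets and the list of supported formats below are made-up examples.

    from datetime import date

    RETENTION_YEARS = 7
    SUPPORTED_FORMATS = {"LTO-4", "LTO-5", "disk"}   # what we can still read today

    # (year written, media format) for each backup set in the vault -- made-up data
    backup_sets = [
        (2005, "DLT"), (2007, "LTO-3"), (2009, "LTO-4"), (2011, "LTO-5"),
    ]

    def retention_gaps(today):
        """Years inside the retention window with no restorable backup set."""
        gaps = []
        for year in range(today.year - RETENTION_YEARS + 1, today.year + 1):
            ok = any(y == year and fmt in SUPPORTED_FORMATS
                     for y, fmt in backup_sets)
            if not ok:
                gaps.append(year)
        return gaps

    if __name__ == "__main__":
        print("Years with no restorable backup:", retention_gaps(date(2011, 11, 25)))

With the example data the gaps are most of the window: the tapes may exist, but the drives to read the older formats do not, which is exactly the trap the hardware refresh cycle sets for the seven-year requirement.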
