Cloud backup and the importance of dual copy
Why backup to cloud storage makes sense
Backing up to the cloud solves a number of problems that plague conventional backup methods. It provides an offsite backup copy without the laborious movement of physical media that older backup strategies demanded. It also provides network segmentation – creating an "air gap" that is an effective barrier against the spread of ransomware. And like all cloud-based technologies, it brings elasticity, pay-for-what-you-use pricing, and opex (rather than capex) spending – all good reasons businesses have moved to cloud-based backups.
One downside of cloud-based backups, however, is network latency and bandwidth. Many technologies have been developed to address this – incremental backups, data compression, de-duplication, and so on – all aimed at reducing the footprint of the data that has to travel over the network. With the help of these technologies, once a one-time full backup is complete, most devices settle into a steady-state incremental backup mode, which is not nearly as punishing in terms of network bandwidth.
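To make the incremental idea concrete, here is a minimal sketch in Python of one common approach: fingerprint each file with a content hash, and only compress and send the files whose content changed since the last run. The function names and the digest-dictionary format are illustrative assumptions, not any particular product's API.

```python
import hashlib
import zlib


def file_digest(path: str) -> str:
    """Return a SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def incremental_backup(paths, previous_digests):
    """Select only files whose content changed since the last run,
    compressing each payload before it travels over the network.

    Returns (list of (path, compressed_payload), updated digest map).
    """
    new_digests = {}
    to_upload = []
    for path in paths:
        digest = file_digest(path)
        new_digests[path] = digest
        if previous_digests.get(path) != digest:
            with open(path, "rb") as f:
                to_upload.append((path, zlib.compress(f.read())))
    return to_upload, new_digests
```

On the first run (an empty digest map) every file is uploaded – the one-time full backup. On subsequent runs only changed files travel over the wire, which is the steady state described above.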
However, restoring large volumes of data remains a challenge. When restoring an entire device – which could easily mean multiple GB (or even TB) of data – network latency and unreliability can be deal-breakers. Many modern backup solutions allow partial restores, i.e. they let you restore the most important files and folders first. But when a user wants all their data back in a short amount of time, there isn't an easy answer.
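The "most important files first" idea can be sketched as a simple prioritized ordering over the restore manifest. This is an illustrative assumption about how a partial restore might rank files – the function and its parameters are hypothetical, not drawn from any specific product.

```python
def prioritized_restore(manifest, priority_prefixes):
    """Order a restore manifest so files under the highest-priority
    folder prefixes come down first; everything else follows.

    manifest: list of file paths to restore.
    priority_prefixes: folder prefixes, most important first.
    """
    def rank(path):
        for i, prefix in enumerate(priority_prefixes):
            if path.startswith(prefix):
                return i
        return len(priority_prefixes)  # unmatched paths restore last
    return sorted(manifest, key=rank)
```

Because the sort is stable, files within the same priority tier keep their original manifest order.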
Dual copy is a mechanism that works around this problem by keeping a local copy of the backup data in addition to the cloud copy. When a backup is performed, each data file is written to the cloud first (the less reliable target) and then to a local storage target. The backup is deemed complete only once both copies are committed.
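The write ordering described above can be sketched as follows. In this minimal Python sketch, plain directories stand in for the cloud and local storage targets (a real implementation would call a cloud storage API); the function name and signature are assumptions for illustration.

```python
import shutil
from pathlib import Path


def dual_copy_backup(source: Path, cloud_dir: Path, local_dir: Path) -> bool:
    """Write a backup file to the cloud target first (the less reliable
    leg), then to the local target. The backup counts as complete only
    when both copies are committed."""
    try:
        cloud_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, cloud_dir / source.name)  # cloud leg first
    except OSError:
        return False  # cloud write failed; backup incomplete
    try:
        local_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, local_dir / source.name)  # then the local leg
    except OSError:
        return False  # local write failed; backup incomplete
    return True  # both copies committed
```

Writing the less reliable target first means a cloud failure is detected before any local work is done, and the backup is never marked complete with only one leg on disk.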
Restores from local targets are naturally faster – which helps satisfy restore requests much more quickly.
Keeping a local copy of your backup is no different from what the age-old 3-2-1 rule of backups has always espoused: keep three copies of your data, two of them locally available and one offsite. Usually this is accomplished by taking the first copy of every file (the one on your primary system), making a second copy to a local USB disk, and sending a third copy offsite on removable media (like tape) using commercial-grade backup software.
A backup solution that supports a dual copy as described above, like Parablu's BluVault, automates this process – making it completely hands-off. You satisfy the 3-2-1 rule every day as your backups run and faithfully make two secondary copies of all your data: one local and one in the cloud.
At Parablu we specialize in building data management solutions like backup. We offer solutions that are hosted as well as on-premises – so we know these pros and cons well. Our solutions can also integrate with subscriptions you already have, like Microsoft 365 or Google G Suite, and save you substantially on the storage costs associated with backup.
Write to us at firstname.lastname@example.org to learn more.