Top 10 Things You Want in Your Backup Contract (Part 3)
Our previous post covered the second point, "Getting the data off site." This week, we will focus on point three, "Rate of Change."
Disclaimer: Take these as basic starting templates and get local legal advice, as local jurisdictions may require specific changes.
Rate of Change. What does this have to do with my contract? Should I really care? If you have a contract in which you guarantee that a client's servers are protected not only locally but also through a remote DR or remote file recovery service, then this is very important!
So how much does data change? If I knew a fixed answer to that, I wouldn't have to write a blog for a living! It varies dramatically. For example, a fairly small 80GB Lotus Notes database can change by up to 80 percent a day, a massive 64GB of change. On the other hand, you can have 4TB of data, but because it is all drawings and only one or two get modified a day, you may see only 3GB to 5GB of total change against the 4TB.
So why is this important? If your solution provides for getting data off site, then this applies to you. The mechanics boil down to a combination of bandwidth (upstream, on behalf of the client) and the rate of change, which together determine whether you stand a prayer of getting the data off site before the next backup window.
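To make that concrete, here is a minimal sketch of the feasibility check. All the figures (the 20Mbps upstream link, the 0.8 efficiency factor, the 12-hour window) are hypothetical examples for illustration, not numbers from any real contract:

```python
def hours_to_upload(change_gb, upstream_mbps, efficiency=0.8):
    """Hours needed to push one day's changed data off site.

    efficiency is an assumed factor for protocol overhead and
    link contention; tune it for your own circuits.
    """
    usable_mbps = upstream_mbps * efficiency
    total_megabits = change_gb * 8 * 1000  # GB -> megabits (decimal units)
    seconds = total_megabits / usable_mbps
    return seconds / 3600

# Hypothetical client: 64GB daily change over a 20Mbps upstream link
hours = hours_to_upload(64, 20)
window_hours = 12  # off-hours window before the next backup starts
print(f"Upload takes {hours:.1f}h; fits the window: {hours <= window_hours}")
```

If the computed hours exceed the window, no amount of contract language will get the data off site; either the bandwidth or the rate-of-change cap has to change.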
So what should you include? Put the following clauses (or similar) into your contract: "The total in-use space by the client is XX GB (or TB)."
This is important, as it establishes the starting point of your total commitment (see the first blog for more information on this). "The total rate of change will not exceed XX GB (or XX percent) per day, nor exceed XX GB (or XX percent) total growth per year. If the client exceeds these amounts, XYZ (you) will not be responsible for any loss of data that might result from exceeding these thresholds."
This sets a rate of change that limits the amount of data you are contractually obligated to support. Will overages still happen? Yes, but the clause limits your liability when the client pushes too much data down the pipe. A good example is a server that holds about 120GB when you look at it (even though it is a 500GB server). Then along comes the local admin, who decides to copy his laptop (all 130GB of it) onto the server for safekeeping. Doing this every night makes the daily delta 130GB plus whatever change the company would have had otherwise. You end up with a massive backlog, and you eventually fill up your local backup appliance because you did not account for these kinds of daily deltas!
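The backlog scenario above can be sketched with assumed numbers: the 130GB nightly laptop copy from the example, a normal delta of 15GB, and a link that can clear only 40GB per night (all hypothetical figures):

```python
def backlog_after(days, daily_change_gb, nightly_capacity_gb):
    """Unsent data piling up on the local appliance when the daily
    delta exceeds what the link can clear each night."""
    backlog = 0.0
    for _ in range(days):
        backlog += daily_change_gb          # new changed data lands
        backlog -= min(backlog, nightly_capacity_gb)  # link clears what it can
    return backlog

# Normal 15GB delta plus the admin's 130GB laptop copy, against a
# link that clears 40GB per night (assumed figures)
print(backlog_after(7, 15 + 130, 40))
```

After a week the appliance is holding hundreds of gigabytes it was never sized for, which is exactly why the contract caps the daily rate of change.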
Bottom Line: Put in definitions that protect you from the unknown. Next week we will talk about how you need to put parameters on the bandwidth, so you can get the data off site (this goes hand in hand with point No. 3 this week).
If you are interested in finding out more about Zenith's TigerCloud with built-in business continuity, click HERE.
Rich Reiffer is VP of Cloud Practice at Zenith Infotech. Rich has been in the technology business since the dark ages, starting with Burroughs Corp. and spending time with Steve Jobs (NeXT) and Ray Noorda (Novell). He has been in the VAR channel since the mid-'80s with companies like Inacomp and Businessland, finally forming his own company, Trivalent, in 1991. After 20 years of building data centers, Rich has come on board with Zenith to head up the Cloud group. Monthly guest blogs such as this one are part of Talkin' Cloud's annual platinum sponsorship.