In the wake of Microsoft’s September 4-5 South Central U.S. outage for Office 365 and Azure, it’s worth asking: should you be concerned about backing up your Office 365 data?
It’s your data, and while Microsoft normally does a good job of protecting it, it’s ultimately your responsibility. If it’s lost, especially if it’s not lost as a result of Microsoft’s failure, don’t expect Microsoft to race off to your rescue.
Please let us know if you will attend. We’ll be there in force. Here is what you can accomplish at VeeamON 2018:
• Gain access to over 60 breakout sessions covering the latest in data management and Hyper-Availability
• Connect with experts and get hands-on experience with Veeam solutions
• Attend VMCE training or sit for your VMCE certification onsite during the conference
• Discover solutions in Veeam’s Partner Expo Lounge, where over 50 sponsors will be exhibiting
• Network with peers and industry experts to learn what is and isn’t working in their industries
Whether you have completely migrated to Office 365, or have a hybrid Exchange and Office 365 deployment, your business objectives remain the same. You must remain in control of your data and you need Office 365 backup and recovery at your fingertips.
One of the most vulnerable situations for an IT Admin is when their only option is to send a support ticket and wait. Don’t let this be you.
Backup for Microsoft Office 365 mitigates the risk of losing access to your Exchange Online email data and ensures Availability to your users.
With Office 365, it’s your data
Microsoft Office 365 enables you to work anywhere, anytime, without the need to maintain your own email infrastructure. It also provides a great way to minimize your on-premises footprint and free up IT resources. Even though Microsoft takes on much of the management responsibility, this doesn’t replace the need to have a local backup of your email data.
With Office 365, it’s your data — you control it — and it is your responsibility to protect it. Utilizing Backup for Microsoft Office 365 allows you to:
• Empower your IT staff to take control of your organization’s Office 365 data
• Reduce the time and effort needed to find and restore email data
• Protect against data loss scenarios that are not covered by Microsoft
• Facilitate the migration of email data between Office 365 and on-premises Exchange
Back up Office 365 email
You need to securely back up Office 365 email data to your environment for a variety of reasons (e.g., to follow the 3-2-1 Rule of backup, to facilitate eDiscovery and to meet internal policy and compliance requirements). The most important reason is the peace of mind that comes from knowing you’ll be able to restore your users’ data when needed!
With Backup for Microsoft Office 365, you can retrieve Office 365 Exchange Online mailbox items (email, calendar and contacts*) from a cloud-based instance of Office 365 and uniquely back up this mailbox data into the same format that Microsoft Exchange uses natively — an archive database based on Extensible Storage Engine (ESE), also known as the Jet Blue database.
Restore Office 365 email, calendars, and contacts
Never settle for less than fast, efficient recovery of Office 365 mailbox items with best-of-breed granularity.
Veeam Explorer™ for Microsoft Exchange allows for quick search and recovery of individual mailbox items residing in either archived Office 365 content or on-premises Exchange backups. Mailbox items can be restored directly to an Office 365 mailbox, an on-premises Exchange mailbox, saved as a file, emailed as an attachment or exported as a PST.
eDiscovery of Office 365 email archives
Without a local copy of your data, retrieving emails for regulatory or compliance reasons can be costly and time consuming, and can ultimately present a major disruption to normal business operations.
But, not with Veeam! You can leverage the familiar, advanced search capabilities and the flexible recovery and export options of Veeam Explorer for Microsoft Exchange to perform eDiscovery on Office 365 email archives — just as easily as you would today with your on-premises Exchange server backup.
To start a free trial of Office 365 Backup, contact firstname.lastname@example.org
Knowing and planning for an appropriate level of bandwidth is a key component of every DRaaS solution. In our most common DRaaS implementation, the data is moved from local repositories at the customer site over the internet or WAN to our data centers once per day, even though that data may contain multiple restore points.
Here’s a diagram showing how the data moves.
The key to an effective solution is knowing whether the remote data center is getting the data on a timely basis. We measure this daily, and we call this measurement “quality”: in other words, how current is the remote data? If it’s within the desired recovery point objective (RPO) agreed in the service level agreement (SLA), we regard it as being “in SLA”.
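As a minimal sketch of that daily “quality” check (the function name and arguments here are illustrative assumptions, not our actual monitoring code):

```python
from datetime import datetime, timedelta, timezone

def within_sla(newest_remote_restore_point, rpo_hours, now=None):
    """Return True when the newest restore point at the remote
    data center is no older than the agreed RPO, i.e. 'in SLA'."""
    now = now or datetime.now(timezone.utc)
    return now - newest_remote_restore_point <= timedelta(hours=rpo_hours)
```

For example, against a 24-hour RPO, a remote restore point that is 20 hours old is in SLA, while one that is 30 hours old is not.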
It’s essential that the bandwidth between the customer site and the remote data center is sufficient to move the data that’s changed every day quickly enough that it’s in the remote data center in time to support the agreed or planned RPO. See our discussion of RPO here. As a general rule for most sites, we find that 4 to 6 hours is “quickly enough”, and this is usually scheduled overnight.
Why not use 8 or 12 hours? In more complex implementations there are other events, like multiple backup jobs, each of which uses resources and must finish. So 4 to 6 hours is a conservative window of time. Some sites may be able to use longer windows – and therefore less bandwidth – to move the data, or simply move more data per day in the longer window.
Getting to the Question
So, how much bandwidth is required to move the data in 4 to 6 hours? To answer this question, we need two pieces of data and a bit of math. First, we need to know the total storage in use. This is your disk storage in use across all servers. If you’re running VMware, one way to get this data is from your vSphere console as covered here.
Second, we need to know the daily data change rate. This is easily measured by Veeam ONE as discussed in a previous post, or it can be derived from simply looking at the actual size of daily backup files.
Now for the Math
If R is the required speed in Mbps (see note 1), D is the data changed each day in Gb (see note 1) and T is 6 hours expressed in seconds (21,600), then our formula is: R = (D × 1,000) / T, where the factor of 1,000 converts gigabits to megabits.
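A short sketch of this calculation in code, assuming D is measured in gigabits and T in seconds as defined above (the function name is illustrative):

```python
def required_mbps(changed_gbit_per_day, window_hours=6):
    """R = (D * 1000) / T, with D in gigabits and T in seconds.
    The factor of 1,000 converts gigabits to megabits so the
    result is in Mbps."""
    t_seconds = window_hours * 3600
    return changed_gbit_per_day * 1000 / t_seconds

# 5 TB of storage at a 1% daily change rate is 50 GB, or 400 Gb, per day
print(round(required_mbps(400)))  # about 19 Mbps over a 6-hour window
```

Doubling the window to 12 hours roughly halves the requirement, which is the bandwidth trade-off described above.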
To save you from number crunching, we’ve prepared this chart showing typical storage sizes and change rates and the required bandwidth result:
So at a site with 5 TB of storage and a 1% daily data change rate (after compression and deduplication), which we find is typical, the changed data is 50 GB (400 Gb) per day and the required bandwidth is about 19 Mbps.
For a deeper dive, check out this great web-based calculator at WintelGuy.com.
Note 1: WintelGuy also defines all of the relevant units of measure you may encounter.
Summarizing this Idea
• Start with your storage in use
• Find your daily data change rate
• Calculate or look up the required bandwidth here
What if my bandwidth is not enough? There are numerous strategies we employ to deal with slower lines, larger data sizes and higher change rates. The best strategy is developed on a case-by-case basis.
Let us know if you have questions. And we welcome your thoughts and real world experiences.
A recent article in The Economist, “Atoms and the voids,” tells of a data storage breakthrough in which individual chlorine atoms were arranged on a sheet of copper to form the code of zeros and ones used in typical binary storage.
While the advancement is expected to result in improvements in speed and simplicity, the cost of the project was large and the technical requirements were severe: a scanning tunneling microscope was used at temperatures of -196°C, cooled by liquid nitrogen, and the process was very slow, with read and write speeds of 1-2 minutes per 64 bits.
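A quick back-of-the-envelope check, assuming the reported 1-2 minutes per 64-bit block, shows just how slow that is for even one kilobyte:

```python
# How long would one kilobyte take at 1-2 minutes per 64-bit block?
bits_per_kb = 1024 * 8        # 8,192 bits in one kilobyte
blocks = bits_per_kb // 64    # 128 blocks of 64 bits each
fastest = blocks * 1          # minutes at 1 minute per block
slowest = blocks * 2          # minutes at 2 minutes per block
print(f"{blocks} blocks -> {fastest}-{slowest} minutes per kilobyte")
```

That works out to roughly two to four hours per kilobyte.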
Not surprisingly, the very idea of storing data within individual atoms can be challenging to understand, so an animation by Stijlbende explains the process used by the scientists to store one kilobyte of data in an area 100 nm × 100 nm in size.
But the density achieved was quite impressive: “78 trillion bits per square centimeter, which is hundreds of times better than the current state of the art for computer hard drives.”
And that’s the part we find very interesting – not the density – but the relative density compared to today’s storage.
Data Density in Perspective
Thirty years ago a typical home PC came with a 10MB disk drive. Today, it’s a 1TB drive. That’s an increase in density of 100,000 times.
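The 100,000-times figure is easy to verify with decimal units (1 TB = 10^12 bytes, 10 MB = 10^7 bytes):

```python
old_drive_bytes = 10 * 10**6   # 10 MB: a typical home PC drive thirty years ago
new_drive_bytes = 1 * 10**12   # 1 TB: a typical drive today
growth = new_drive_bytes // old_drive_bytes
print(growth)  # 100000, i.e. a 100,000x increase in capacity
```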
If arranging individual atoms is only hundreds of times more space efficient than today’s storage technologies, are we approaching limits where density will no longer grow so quickly? One thing is certain – the rate at which we create data shows no sign of slowing.
The future will be interesting. Could this point to the need for larger storage systems, larger servers, larger data centers?
The Veeam Lunch N Learn in Alpharetta was a great event to introduce the inside sales teams to Global Data Vault and our unique culture of quality. Here are a few photos from our fun day playing “The Vice is Right”: