California Wildfire Occurrences Highest in October

Wildfires have taken a top spot in the news as of late. They cost American taxpayers between $20 billion and $100 billion every year, in expenses that range from air pollution and soil degradation to public health challenges and loss of human life.

If it weren’t for humans, wildfires might be more manageable. Four of the top five causes of wildfires are man-made: campfires, smoking, outdoor burning, and vehicle exhausts and brakes. The only natural cause among the top five is lightning.

No state is immune from wildfires; they have occurred in all 50 states. California, however, takes the top spot, suffering both the most and the worst of these deadly fires. In 2012, the amount of land burned in California was seven times as much as in all the other western states combined. Californians also spent more than twice as much as the other states on fire suppression, well over a billion dollars.

Not surprisingly, the five most costly wildfires in United States history all happened in California. Occurring between 1991 and 2007, those fires did a total of more than $6.5 billion in damages. During 2003, in San Diego alone, almost 400,000 acres of land burned and more than 3,000 homes were lost, along with almost $6 billion in lost business, medical costs, transportation system and watershed protection costs, and infrastructure damage.

Try This Interactive Disaster Map


Any hurricane is cause for alarm, and we’re glad Hurricane Arthur dissolved without hugely significant damage to the US compared to some of Arthur’s predecessors. However, the chances of another hurricane striking that same area are noteworthy, as our research portrayed within this infographic shows.

As you can see, we are enthusiastic about maps layered with data, and the interesting hypotheses that can be made from them. Because we are in the business of disaster recovery, we are especially intrigued by maps that illustrate high risk areas for natural disasters. It should serve as no surprise that when we found this interactive map from National Geographic, we just had to share it.

Called a “Doomsday Disaster Map,” it’s probably the inspiration for numerous future Hollywood blockbusters. It’s no wonder the map is the basis for NatGeo’s primetime show called “Doomsday Preppers.” The show features “ordinary” people who are preparing for the end of the world through many ingenious (and extreme) means.

Whether you tune into the show or not, this map is an engaging examination of how different disasters could unfold across the US. With a few clicks you can speculate how a super volcano in Yellowstone, a mega earthquake, an agriculture collapse, solar flare, nuclear fallout, or even a pandemic may affect you.  Interesting at the least, memorable for sure.

These are exactly the types of scenarios that we advise preparing for with a cloud disaster recovery plan. We apologize in advance if you are awake at 3 a.m. still thinking about it, but take a look.


Predicting Hurricanes




Hurricanes mainly threaten the US Gulf and East Coasts, yet of all the natural disasters that happen in America, they are the most costly and destructive. Hurricanes can strike anywhere from southern Texas to New England, but by far, Louisiana gets the worst of it. Ten parishes in Louisiana get pummeled by hurricanes an average of once every 2.9 years.

The magnitude of impact for hurricanes is massive. Hurricane Katrina — only a category 3 hurricane when it hit land — caused $125B in losses and took the lives of 1,833 people. Hurricane Sandy was a mere category 2 when it hit land and it managed to cause $68B in damages and killed 286 people. Comparatively, a category 5 hurricane can deliver winds in excess of 155 mph, fast enough to knock down all the trees in its path and completely demolish substantial buildings.

But as dangerous as a hurricane’s winds are, the greatest danger it poses comes from the storm surge it creates. A category 5 hurricane can produce a storm surge up to 18 feet high, enough to cause serious flooding for miles inland, damaging everything below 15 feet above sea level.

Hurricanes can have a diameter of 400 to 500 miles, the eye can be 20 miles across, and they often spin off tornadoes, which only complicates the situation further.

Global Data Vault has developed this infographic from various sources but largely from data provided by FEMA. We hope this serves to educate the business community, raise awareness and ultimately to enhance the general state of preparedness.

Through better planning and preparation, hurricane losses – both financial and human – can be decreased.


A Tornado’s Costly Path



Recent events remind us again that tornadoes often exact a sad and deadly cost from the victims in their path. Even an F-3 tornado can cause devastation and the total destruction of everything in its path. Homes are reduced to matchsticks, office buildings to piles of rubble. In 1997, the tornado that hit Jarrell, Texas, actually ripped the pavement right off a roadway. Tornadoes can destroy lives and possessions outright, and the aftereffects of their cost to a business can be felt long after the wind dies down.

The cost to clean up and restore tornado-ravaged areas is rather surprising. Any tornado can destroy irreplaceable data or keepsakes, but only a few tornadoes create the horrific damage that captivates the news for weeks and ranks off the charts in recovery costs. F-5 tornadoes have the potential to do a billion dollars of damage per tornado. Here’s where the numbers get interesting. In today’s money, three quarters of all tornadoes between 1953 and 2013 resulted in damages under $50,000. Of the 1,200 tornadoes that occurred in this period, 300 did more than $50 million in damages, and six percent created over $500,000 in damages. Yet the top nine in this period did more than $6 billion in cumulative damages. The tornado that hit Oklahoma City in 2013 created, by itself, more than $2 billion in damages.

Tornadoes have occurred on every continent except Antarctica, but by far the overwhelming majority occur in “Tornado Alley” in the United States, an area that includes parts of Texas, Oklahoma, Kansas, Nebraska and South Dakota, with some areas more vulnerable than others. Between 1953 and 2013, 32 counties in the US suffered 3 tornado disasters; 2 counties in Oklahoma suffered 4; 3 counties in Arkansas suffered 5; and Lonoke County in central Arkansas was hit by 6 tornadoes.

One quarter of businesses that close after a disaster never reopen. All too often, that’s because they lose their data and recreating it, if possible, is simply cost-prohibitive. The only failsafe data disaster recovery lies in having networks and data backed up or replicated in more than one geographically and environmentally safe, remote location. To learn how Global Data Vault can provide the data protection to fit your needs and your budget, contact us.

Sources: Economic and Societal Impacts of Tornadoes; CoreLogic; Storm Prediction Center

Information Destruction Through History


Information is the most valuable commodity in the world. All human progress depends on the accumulation and preservation of information; when information is lost, human progress suffers. This infographic displays some of the most significant losses of information human civilization has suffered.


Throughout the ages, it has happened again and again. Whole libraries of clay tablets, papyrus scrolls, bark codexes and paper books have been destroyed by natural disasters, fire and war. The Royal Library of Alexandria, where the accumulated knowledge of ancient scientists, physicians and philosophers was stored, was destroyed by fire. The destruction likely started during Caesar’s Civil War, when Julius Caesar purposefully set his own ships ablaze, and many scholars believe the library suffered numerous other tragic fires throughout history. More than 120,000 volumes written by classical Greek and Roman authors were lost when fire destroyed the library at Constantinople in 473 A.D. Virtually all of the codexes recording the history, beliefs and sciences of the Maya were intentionally destroyed by the Spanish as works of the devil. In World War II, libraries containing millions of books were destroyed as strategic acts of war.

During the Yugoslav Wars of the 1990’s, the 17,000 volumes of the Oriental Institute in Sarajevo were directly targeted, along with the National Museum and National Library. The Iraq War saw the destruction of more than 400,000 books in the Iraq National Library, including priceless records of the world’s first urban, literate civilization. On 9/11, 21 libraries inside the World Trade Center, the records of 3,000 to 4,000 active cases before the Securities and Exchange Commission, files belonging to the CIA and EEOC, U.S. trade documents dating back to the 1840s, the offices and archives of Helen Keller International, $100,000,000 in privately owned artworks, thousands of photo negatives of JFK and more than 900,000 archeological artifacts were all lost.*

By our estimate, the equivalent of 34,524.8 GB of information created by humans, counting only what we can quantify with reasonable certainty, has been destroyed throughout our history. We reached this figure by using data from the Amazon Kindle 3, which claims to hold 3,500 books per 4 GB.
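That conversion is simple arithmetic; here is a minimal sketch of it. The book count used below is the value implied by the article's own figures (3,500 books per 4 GB), not a number taken from its sources.

```python
# Convert an estimated number of destroyed books into gigabytes,
# using the article's Kindle 3 ratio of 3,500 books per 4 GB.

BOOKS_PER_4GB = 3_500

def books_to_gb(books: int) -> float:
    """Return the storage equivalent (in GB) of a number of books."""
    return books * 4 / BOOKS_PER_4GB

# ~30.2 million books is the count implied by the article's 34,524.8 GB total.
implied_books = 30_209_200
print(f"{books_to_gb(implied_books):,.1f} GB")  # 34,524.8 GB
```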

There are many well-known accounts of destroyed libraries and houses of information that we were unable to include in this figure due to a lack of reputable sources on the amounts of information destroyed. Those instances include the Library of Antioch in 363 A.D., the Royal Library of Ashurbanipal in 600 B.C., the Imperial Library of Ctesiphon in 754 A.D., the Library at Nalanda University in 1193 A.D., and the House of Wisdom in 1258 A.D.

As most information today is stored as electronic data, it is not as vulnerable to fire, but it is very vulnerable to new threats: today’s data can be stolen, destroyed or compromised by disgruntled employees, competitors, terrorists, criminals and malicious hackers. Fortunately, though, electronic data can be copied, encrypted, backed up and stored in multiple locations as it is being created, far more easily than copying scrolls by candlelight with a quill pen, and it can be easily retrieved and restored in case of a disaster.

And the destruction continues.





Data Center Exposure and Recovery in New York City


Hurricane Sandy provided a fascinating opportunity to study both the level of disaster planning and the resilience of New York City data centers. This article examines a) what actually happened, b) what the risk was, and c) what lessons were learned.

What Actually Happened?

Simply put, data centers in New York were caught off guard. Consider these incidents.

Internap and Peer 1, located at 75 Broad Street, suffered basement-level flooding which knocked out diesel fuel pumps.


Datagram, located at 33 Whitehall, experienced the exact same problem: 5 feet of water in the basement. As a result, several high-profile blogs and numerous websites went dark.

Both of these facilities are located in a Zone A flood zone, FEMA’s second-highest risk category.

Then there were fuel supply issues. Fog Creek, which makes and hosts Trello, Copilot and other popular platforms, is housed in Peer 1 and had to assemble a bucket brigade to carry diesel fuel up 17 stories to refuel a generator. As a precaution, Trello was moved to Amazon Web Services and seems to have suffered limited downtime, but the bucket brigade was still required.

ShoreTel, the VoIP provider, had 3 data centers, all in lower Manhattan, including 75 Broad St. That facility did successfully switch over to generator power, but due to “city restrictions” the generators had to be shut down, and 700 customers went down.

Fortunately, things did not get worse for Fog Creek, but carrying 5 gallon buckets of diesel fuel up 17 stories in a building with power problems strikes us as a recipe for something truly horrible.


Teams from Squarespace fill buckets with diesel fuel to haul them up 17 stories to the generator keeping the data center online. Staff from Peer 1, Squarespace and Fog Creek Software have formed this unusual Internet bucket brigade. (Photo via Squarespace)

A typical rack of servers requires 5 to 10 kW of power, including cooling/HVAC. Typical data centers range in size from 5,000 to 40,000 square feet. A mid-sized facility at 20,000 sq ft would house about 600 racks. That equates to roughly 5 megawatts (MW) of power. A reasonably efficient diesel generator would require roughly 200 gallons of diesel per hour to push out 5 MW, a bit over 3 gallons per minute.

Typically, data centers tell us they have 1 week of diesel onsite and a resupply contract. A full week for a 20,000 sq ft data center is about 34,000 gallons. We suspect that in lower Manhattan, the standard was more like 1 day. Then resupply problems hit because of the street flooding and road and bridge closures.
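The fuel arithmetic above can be sketched in a few lines. The inputs are the article's round numbers (600 racks, 200 gallons per hour); the 8.3 kW per rack is an assumed midpoint of the 5-10 kW range, and the computed week comes to 33,600 gallons, which rounds to the 34,000 cited.

```python
# Back-of-the-envelope data center fuel math, using the article's round numbers.

RACKS = 600                   # ~600 racks in a 20,000 sq ft facility
KW_PER_RACK = 8.3             # assumed midpoint of the 5-10 kW range, incl. cooling
GALLONS_PER_HOUR = 200        # diesel burn to sustain ~5 MW on generators

power_mw = RACKS * KW_PER_RACK / 1000          # ~5 MW total load
gallons_per_minute = GALLONS_PER_HOUR / 60     # a bit over 3 gal/min
week_of_fuel = GALLONS_PER_HOUR * 24 * 7       # onsite diesel for one week

print(f"~{power_mw:.1f} MW, {gallons_per_minute:.1f} gal/min, {week_of_fuel:,} gal/week")
```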


What was the Risk?

The Mid-Atlantic States do not see nearly as many hurricanes as the Southeast and the Gulf Coast of the United States. The average return period for hurricanes within 50 miles of New York City is 18 to 19 years.
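A return period translates directly into an annual probability, and compounding it shows why an 18-to-19-year figure still matters. This is a hypothetical illustration: only the return period comes from the text, while the 30-year horizon and the assumption of independent years are our own simplifications.

```python
# Convert a hurricane return period into cumulative probability over a horizon.

return_period_years = 18.5          # midpoint of the 18-19 year figure above
annual_p = 1 / return_period_years  # ~5.4% chance in any given year

# Probability of at least one such hurricane in a 30-year span,
# assuming independent years (a simplification).
p_30yr = 1 - (1 - annual_p) ** 30
print(f"annual: {annual_p:.1%}, over 30 years: {p_30yr:.0%}")
```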

For the largest part of hurricane season, the typical hurricane tracks, as observed by NOAA, take these storms out to sea at the more northern latitudes of the NYC area.

Here are the July, August and September typical tracks:




But look at how this changes in October:



And notice how closely Hurricane Sandy lined up with the typical October track.



Finally, what about the frequency of storm origin in October? Compare the frequency map for August 21-31 origins, the peak of hurricane season, to the October 11-20 origin map below:



You can see that activity is lower in October, but the season is hardly as dormant as it becomes just a few weeks later:


Just as August and September are the periods of greatest risk in the Southeast and the Gulf Coast, October clearly presents the greatest risk of hurricanes in NYC.

What is the solution?

If these providers had built to the following standards, downtime would have been minimized:

  • One week of fuel for standby power onsite
  • A resupply plan for fuel in place, or
  • A redundant or backup site more than several hundred miles away

For any disaster recovery, hosting or colocation solution, we would look to the Uptime Institute, which publishes the Data Center Site Infrastructure Tier Standard for Operational Sustainability.

Based on their standard, we’d offer the following comparison; the “Higher Risk” column describes the risk profile of Lower Manhattan.

Disaster Risk Component   | Higher Risk                   | Lower Risk
Flooding and Tsunami      | < 100-year flood plain        | > 100-year flood plain
Hurricanes and Tornadoes  | High                          | Medium
Seismic Activity          | Zone 3 or 4                   | Zone 2A or 2B
Airport/Military Airfield | < 3 miles from active runway  | > 3 miles from active runway
Adjacent Properties       | Chemical plant, etc.          | Office buildings, land
Transportation Corridors  | < 1 mile                      | > 1 mile
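As a hypothetical sketch, the criteria in the table above can be screened programmatically. The criteria come from the table; the function name, scoring, and the example values characterizing Lower Manhattan are our own assumptions, not figures from the Uptime Institute standard.

```python
# Screen a candidate data center site against the risk criteria above.
# Each check is True when the site falls in the "Higher Risk" column.

def higher_risk_flags(site: dict) -> list[str]:
    checks = {
        "flood": site["flood_plain_years"] < 100,      # inside the 100-year plain
        "storms": site["storm_exposure"] == "high",    # hurricane/tornado exposure
        "seismic": site["seismic_zone"] in ("3", "4"),
        "airfield": site["miles_to_runway"] < 3,
        "adjacency": site["adjacent_hazard"],          # e.g. chemical plant next door
        "transport": site["miles_to_corridor"] < 1,
    }
    return [name for name, risky in checks.items() if risky]

# Lower Manhattan, with illustrative (assumed) values:
lower_manhattan = {
    "flood_plain_years": 50, "storm_exposure": "high", "seismic_zone": "2A",
    "miles_to_runway": 10, "adjacent_hazard": False, "miles_to_corridor": 0.5,
}
print(higher_risk_flags(lower_manhattan))  # ['flood', 'storms', 'transport']
```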


To review your site’s risk of various natural disasters, see our Natural Disaster Risk Maps.