On November 29, 2018, Digital Journal published a discussion of predictive analysis and big data which states: “It is hoped that the big data analytics approach can be used to gain improved predictions for tornadoes.” On that premise, big data seems to be the right choice for predictive analysis; but does it really help in natural disaster recovery?
The Identity Theft Resource Center's (ITRC) end-of-year data breach report indicates “there was a 126 percent increase from 2017 (197,612,748 records exposed) to 2018 (446,515,334 records exposed).” Another report, released by Dynamic Technologies, states: “hardware failures cause 45% of total unplanned downtime, loss of power (35%), software failure (34%), data corruption (24%), external security breaches (23%), and accidental user error (20%)” and “75% of small businesses have no disaster recovery plan.”
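The ITRC's 126 percent figure can be verified directly from the record counts it quotes; a minimal Python sanity check (the two counts come from the report, everything else is just arithmetic):

```python
# Record counts quoted from the ITRC end-of-year data breach report.
records_2017 = 197_612_748
records_2018 = 446_515_334

# Year-over-year percentage increase.
increase = (records_2018 - records_2017) / records_2017 * 100
print(f"Year-over-year increase: {increase:.0f}%")  # prints "Year-over-year increase: 126%"
```

Rounded to the nearest whole percent, the result matches the 126 percent the report cites.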
In 2009, Symantec Corporation released a survey indicating that “93 percent of US organizations had to execute disaster recovery plans and the average cost of implementing disaster recovery plans for each downtime incident was $287,000.”
Based on these statistics, disaster recovery provides very little assurance. However, many businesses in the medical, government, criminal justice, education, property, and financial services sectors are required to have disaster recovery plans, which include manual and automated procedures for backup/restore and data breach response. Predictive analysis features are just another necessary evil…