On November 29, 2018, Digital Journal published a discussion of predictive analytics and big data which states: “It is hoped that the big data analytics approach can be used to gain improved predictions for tornadoes.” Based on that proposition, big data seems the right choice for predictive analysis; but does it really help in natural disaster recovery?
The Identity Theft Resource Center’s (ITRC) end-of-year data breach report indicates “there was a 126 percent increase from 2017 (197,612,748 records exposed) to 2018 (446,515,334 records exposed).” Another report, released by Dynamic Technologies, states: “hardware failures cause 45% of total unplanned downtime, loss of power (35%), software failure (34%), data corruption (24%), external security breaches (23%), and accidental user error (20%)” and “75% of small businesses have no disaster recovery plan.”
In 2009, Symantec Corporation released a survey indicating that “93 percent of US organizations had to execute disaster recovery plans and the average cost of implementing disaster recovery plans for each downtime incident was $287,000.”
Based on these statistics, disaster recovery provides very little assurance on its own. However, many businesses in the medical, government, criminal justice, education, property, and financial services sectors are required to maintain disaster recovery plans, which include manual and automated procedures for backup/restore and data breach response. Predictive analysis features are just another necessary evil…
The threat of artificial intelligence has recently reared its head in courtroom proceedings. The story, covered by DailyMail.com on September 22, 2017, touts “a machine-learning algorithm that can accurately predict over 70 percent of Supreme Court decisions.”
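The article does not describe how the algorithm works, but such systems are typically framed as supervised classification over case features, and any claimed accuracy only matters relative to a naive baseline. A minimal sketch of that baseline idea, with entirely invented outcome data for illustration:

```python
# Hypothetical sketch: court-outcome prediction as binary classification.
# The outcome data below is invented; the article's actual model and
# training data are not described here.

def majority_baseline(outcomes):
    """Return the most common historical outcome; a real model must beat this."""
    return max(set(outcomes), key=outcomes.count)

def accuracy(predictions, actual):
    """Fraction of predictions that match the actual outcomes."""
    correct = sum(p == a for p, a in zip(predictions, actual))
    return correct / len(actual)

# Toy history: 1 = lower court reversed, 0 = affirmed.
history = [1, 1, 0, 1, 1, 0, 1, 0, 1, 1]
baseline = majority_baseline(history)  # always predicts the majority class

test_outcomes = [1, 0, 1, 1, 0]
predictions = [baseline] * len(test_outcomes)
print(f"baseline accuracy: {accuracy(predictions, test_outcomes):.0%}")
```

The point of the sketch: because appellate courts reverse in a majority of the cases they accept, a constant "reverse" guess already scores well, so a headline figure like "over 70 percent" needs to be read against that baseline.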
So how much of this technology is artificial intelligence, and is it a threat to society? We can assess this against previous courtroom decisions. In 1977, the Supreme Court of the United States vacated and remanded the decision in Gardner v. Florida. “The Petitioner was denied due process of law when the death sentence was imposed by the Florida Supreme Court, at least in part, on the basis of information that he had no opportunity to deny or explain,” stated Justice Stevens. The information utilized in the case came from a report containing details of which Gardner was unaware.
On July 13, 2016, a similar report was used at the sentencing of Eric M. Loomis and evaluated on appeal by the Wisconsin Supreme Court. The report was generated by COMPAS, a proprietary algorithm that scores violence and recidivism risk as part of case triage. In its opinion, the Wisconsin Supreme Court stated, “The court of appeals certified the specific question of whether the use of a COMPAS risk assessment at sentencing violates a defendant’s right to due process, either because the proprietary nature of COMPAS prevents defendants from challenging the COMPAS assessment’s scientific validity, or because COMPAS assessments take gender into account.” The court determined, “if used properly, observing the limitations and cautions set forth herein, a circuit court’s consideration of a COMPAS risk assessment at sentencing does not violate a defendant’s right to due process,” further explaining that its decision was supported by other independent factors; thus, the risk assessment was not determinative.
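COMPAS's actual algorithm is proprietary, which is precisely what the due-process challenge turned on. To make the general shape of a risk scale concrete, here is a toy sketch of a weighted-factor score mapped to triage bands; every factor, weight, and threshold below is invented for illustration:

```python
# Hypothetical sketch of a recidivism-style risk scale. All factors,
# weights, and band thresholds are invented; they do NOT reflect COMPAS.

RISK_WEIGHTS = {
    "prior_offenses": 2.0,   # assumed weight per prior offense
    "age_under_25": 1.5,     # assumed weight for this flag (0 or 1)
    "unemployed": 1.0,       # assumed weight for this flag (0 or 1)
}

def risk_score(factors):
    """Sum each factor's value times its weight into a raw score."""
    return sum(RISK_WEIGHTS[name] * value for name, value in factors.items())

def risk_band(score):
    """Map a raw score onto a coarse triage band."""
    if score < 3:
        return "low"
    if score < 6:
        return "medium"
    return "high"

defendant = {"prior_offenses": 2, "age_under_25": 1, "unemployed": 0}
score = risk_score(defendant)    # 2*2.0 + 1*1.5 + 0*1.0 = 5.5
print(score, risk_band(score))   # 5.5 medium
```

Because the real weights are a trade secret, a defendant cannot inspect a table like `RISK_WEIGHTS` above, which is the transparency gap the Loomis court weighed against due process.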