Sensational as that headline sounds, it is, for the most part, true — and yes, you read that right: bad data was responsible for the loss of human lives, 346 lives to be precise. In a swift reversal of its stance, Boeing, through its CEO, recently admitted that "erroneous angle of attack information" from the sensors of its 737 MAX airplane triggered the activation of its Maneuvering Characteristics Augmentation System (MCAS). MCAS, in turn, pushed the aircraft into a nosedive, leading to the crash of Ethiopian Airlines Flight 302.

This unfortunate scenario, though localized to the aviation industry, lends deep insight into how seemingly innocuous bad data can propagate catastrophic consequences across all areas of life. And while the consequences for the analytics industry may not manifest as fatalities, there is no gainsaying that their effects can be, and almost always are, critically impactful to all concerned parties.

Complacency in handling data and analytics is not a recipe for success

Although Boeing’s executives might argue differently, the existing body of evidence suggests that the company did have a chance to spot the anomaly in its flight control systems. Late last year, Lion Air Flight JT610 crashed in a very similar manner, plunging into the sea shortly after takeoff. The heightened scrutiny and skepticism trailing that disaster revealed that a significant number of pilots held reservations about the 737 MAX’s MCAS anti-stall system. Boeing at the time, however, rebuffed these claims — a mistake it would not have made had it been actively tracking and properly assessing sensor data from its in-service planes.

This is the same mistake a significant number of organizations, some with statutory liabilities comparable in extent to Boeing’s, continue to make daily. For them, analytics and data governance remain an accessory factor of production, worth the effort only when there is a need to scale up performance or productivity, or, in some cases, to backtrack and trace reported vulnerabilities. As the Boeing scenario shows, by adopting this improvident approach, companies set themselves up for potentially damning consequences that could easily be averted if their overall approach to data and analytics were more proactive.

Data governance and quality assurance must be treated as a rate-limiting factor

Not just as an accessory means of bolstering productivity. To do this, businesses and organizations must set up and embed a functional analytics and data governance model — one that leverages today’s arsenal of innovative data management tools, such as data catalogs (Alteryx Connect, Alation), data integration tools that improve upon MDM (Matillion, Talend, Databricks), and business intelligence tools with monitoring capabilities (Tableau, MicroStrategy) — to drive seismic improvements in efficiency and organizational redundancy. The ultimate goal of this system, very unlike predecessor approaches, is to integrate data analytics, data governance, and all components of data competency into the standing organizational framework of a company.
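To make the idea of proactive monitoring concrete: one of the simplest governance safeguards is cross-validating redundant data feeds and flagging disagreements for human review before any downstream system acts on a single faulty reading. The sketch below is purely illustrative — the function name, data, and tolerance threshold are all hypothetical, not drawn from any of the tools mentioned above.

```python
# Illustrative sketch of a proactive data quality check: compare two
# redundant sensor feeds and flag the positions where they disagree
# beyond a tolerance, rather than trusting either feed blindly.
# All names, values, and thresholds here are hypothetical.

def flag_sensor_disagreements(feed_a, feed_b, tolerance=5.0):
    """Return the indices where two redundant feeds diverge beyond tolerance."""
    flagged = []
    for i, (a, b) in enumerate(zip(feed_a, feed_b)):
        if abs(a - b) > tolerance:
            flagged.append(i)
    return flagged

# The third reading pair diverges sharply, so index 2 is flagged for review.
feed_a = [2.1, 2.3, 21.7, 2.2]
feed_b = [2.0, 2.4, 2.3, 2.1]
print(flag_sensor_disagreements(feed_a, feed_b))  # [2]
```

The point is not the few lines of code but the posture they represent: checks like this run continuously as part of the standing framework, not only after an incident forces a backtrack.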

The reality thereby created for such organizations is one where, by virtue of their analytics framework, they stay on top of the game and figuratively keep an eye on the future. Sometimes this is all that is needed to prevent potentially devastating social and economic catastrophes.

Click here to read our data stories! Then ping us today to explore what a more proactive approach to data and analytics could look like at your organization!