Case Study

Helping our government respond to Covid-19 through trusted data

Virologists, government, and law enforcement coordinate the Covid-19 response based on trusted data.

Challenge

Consistent communication of critical numbers

Data observability, the operational discipline of running high-quality data pipelines, is of growing importance to our client. As a key player in the healthcare policy system, they play an important role in gathering, processing, and distributing data that describes the spread of Covid-19.

With several hundred data suppliers providing updates at least daily, our client faced increasingly complex data pipeline quality challenges.

To make sure all internal and external consumers could trust the validity of frequently updated data products, they had to consistently track key statistics for each individual dataset (sketched in SQL after this list):

  • Data consistency for events sourced from multiple systems;
  • Data freshness for providers whose feeds evolve continuously;
  • Anomaly detection in time series of reported incidents.
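
As an illustration, checks of this kind can be expressed in plain SQL, the language already used for the project's tests. The sketch below is hypothetical: the table and column names (source_a_events, source_b_events, case_reports, reported_at) and the thresholds are assumptions for illustration, not the client's actual checks.

    -- Consistency: daily event counts should match across two source systems.
    SELECT a.event_date, a.event_count AS count_source_a, b.event_count AS count_source_b
    FROM (SELECT event_date, COUNT(*) AS event_count
          FROM source_a_events GROUP BY event_date) a
    JOIN (SELECT event_date, COUNT(*) AS event_count
          FROM source_b_events GROUP BY event_date) b
      ON a.event_date = b.event_date
    WHERE a.event_count <> b.event_count;

    -- Freshness: flag the dataset when its newest record is older than one day.
    SELECT MAX(reported_at) AS last_update
    FROM case_reports
    HAVING MAX(reported_at) < CURRENT_TIMESTAMP - INTERVAL '1 day';

    -- Anomaly detection: flag days that deviate strongly from the trailing
    -- seven-day average of reported incidents.
    SELECT event_date, daily_count, trailing_avg
    FROM (
      SELECT event_date, COUNT(*) AS daily_count,
             AVG(COUNT(*)) OVER (ORDER BY event_date
                                 ROWS BETWEEN 7 PRECEDING AND 1 PRECEDING) AS trailing_avg
      FROM case_reports
      GROUP BY event_date
    ) t
    WHERE daily_count > 2 * trailing_avg OR daily_count < 0.5 * trailing_avg;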

Multiplied by the number of datasets and the hourly refresh rate, this was a daunting task; a structured approach to data observability was essential.

Approach

Orchestrate automated data quality checks.

We wanted to make sure the data engineering team would never become the bottleneck in detecting data quality issues. To that end, we set out to:

  • Enable business stakeholders to verify both the overall status and the status of their individual datasets;
  • Put a proactive alerting system in place for quality issues, relieving data analysts of continuously validating status.

Furthermore, turning data quality monitoring into a business process relieved the data engineering team of a manual, repetitive task.

In the original setup, the project had typical data tests. Most were written in plain SQL and managed reactively: analysts had to run the SQL by hand and alert stakeholders whenever known issues were detected. A typical test looked something like the sketch below.
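
The fragment below is a minimal sketch of such a reactive test, assuming a hypothetical case_reports table; an analyst would run it by hand and raise an alert on any returned rows.

    -- Known issue (hypothetical): reports missing a municipality code
    -- cannot be aggregated. Any returned rows triggered a manual alert.
    SELECT report_id, provider_id, reported_at
    FROM case_reports
    WHERE municipality_code IS NULL
      AND reported_at >= CURRENT_DATE - INTERVAL '1 day';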

Together with our consultants, the team:

  • Defined, developed, and operationalized custom metrics (an example follows this list);
  • Deployed the automated scanning tool in a production data pipeline;
  • Set up the cloud environment on AWS to execute the scanning tasks.
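
As a sketch of what an operationalized custom metric can look like, the query below computes a per-provider duplicate rate that a scheduled scan could evaluate hourly against a threshold. The names and the 1% threshold are illustrative assumptions, not the client's actual metric.

    -- Custom metric (hypothetical): share of duplicate reports per provider.
    SELECT provider_id,
           1.0 - CAST(COUNT(DISTINCT report_id) AS DECIMAL) / COUNT(*) AS duplicate_rate
    FROM case_reports
    GROUP BY provider_id
    -- Alert when more than 1% of a provider's reports are duplicates.
    HAVING 1.0 - CAST(COUNT(DISTINCT report_id) AS DECIMAL) / COUNT(*) > 0.01;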

Results

On-call interventions dropped by 100%, and data carries validity stamps before it is analyzed.

The Soda Data project was deployed in less than two weeks.

Business stakeholders now have access to data stream quality dashboards. They can define their own custom monitors and opt in to proactive alerts.

By enabling a new monitoring process, we have been able to shift from a “do I trust this number?” mindset to an “I trust this number, unless…” mindset.
