Overview

Use cases: data ingestion, data quality monitoring

Outcome: 2–3 outages caught and remediated per month

Learn how Bigeye is helping Zoom achieve their vision of delivering trusted data products to business decision makers.

Powering the world of virtual collaboration.

Since 2020, Zoom has become synonymous with virtual meetings and collaboration, covering a wide range of human interactions including remote work meetings, social and community events, and online classes.

The Zoom data team operates under a hub-and-spoke model: a horizontal data platform team supports the core data stack and infrastructure, while vertical data teams support business units like product intelligence, marketing, and customer service.

Challenge

As with any company in a high-growth phase, Zoom's data systems and processes were scaled up quickly to meet rapidly evolving business needs. While existing ETL tools helped get data into the hands of business stakeholders quickly, the team still lacked a way to track data quality or get unified insight into the operational health of their data pipelines.

To address this, Zoom data teams implemented various custom SQL checks to validate data. However, this approach required the data team to root-cause and remediate data outages and quality issues on a case-by-case basis. Simple issues could sometimes be resolved in a day, but complex issues often took several weeks to identify and address. Tina Chang, a data engineer in the Zoom product intelligence group, noted that it was particularly painful when the team discovered that an ingestion job had failed several days earlier, as this forced the team to backfill the data by reproducing a complex workflow of daily ingestion processes and steps.

Though Tina's team was able to resolve these issues manually, the outages created significant disruptions for the data teams and frustration for business users. The team was spending far more resources than it wanted simply to ensure that business users trusted, understood, and could rely on the analytics they requested.

Solution

As a product-led company, Zoom treats data as a product rather than a commodity. They selected Bigeye to help build holistic data quality testing and pipeline observability into their data product workflows and to shift from reactive firefighting to proactive data quality assurance.

For the horizontal data platform team, this includes using the Bigeye REST API and Airflow operator to track the success of data ingestion jobs with built-in freshness and volume monitoring, as sketched below.
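
The case study doesn't show the operator's exact interface, so the following is a minimal sketch using a plain Airflow PythonOperator and a hypothetical Bigeye REST endpoint to illustrate the pattern: after the load step runs, ask Bigeye to check freshness and volume on the loaded tables and fail the task if anything looks unhealthy. The endpoint path, payload fields, and table ID are illustrative assumptions, not Bigeye's documented API.

```python
# Hedged sketch of an ingestion DAG that triggers Bigeye checks after the
# load step. The endpoint path, payload, and table ID below are illustrative
# assumptions; Zoom's actual setup uses Bigeye's Airflow operator.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_bigeye_checks(**context):
    # Hypothetical REST call: ask Bigeye to run freshness/volume metrics
    # on the tables this DAG just loaded.
    resp = requests.post(
        "https://app.bigeye.com/api/v1/metrics/run",  # illustrative URL
        json={"table_ids": [1234]},                   # placeholder table ID
        headers={"Authorization": "Bearer <API_KEY>"},
        timeout=60,
    )
    resp.raise_for_status()
    results = resp.json()
    # Fail the task (and alert downstream owners) on any unhealthy metric.
    failing = [m for m in results.get("metrics", []) if not m.get("healthy")]
    if failing:
        raise RuntimeError(f"Bigeye flagged {len(failing)} failing metrics")


with DAG(
    dag_id="daily_ingestion_with_bigeye",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # ... extract/load tasks would precede this ...
    check_data = PythonOperator(
        task_id="run_bigeye_checks",
        python_callable=run_bigeye_checks,
    )
```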

Once the data has been ingested, the data teams use Bigconfig, Bigeye's YAML-based monitoring-as-code solution, to automatically apply data quality checks to critical tables, as illustrated below. Bigconfig includes dynamic tagging, which allows Bigeye to automatically apply the appropriate data quality checks, called Autometrics, to new tables as they come online.
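
As a rough illustration of what this looks like in practice, here is a minimal Bigconfig-style file. The top-level structure follows Bigeye's published Bigconfig schema as we understand it, but the warehouse, schema, and tag names are placeholders, and exact field names may differ by version.

```yaml
# Minimal Bigconfig-style sketch; warehouse/schema/tag names are placeholders
# and field names may differ from the current Bigconfig spec.
type: BIGCONFIG_FILE

tag_definitions:
  - tag_id: CRITICAL_TABLES
    column_selectors:
      # Wildcard selector: dynamic tagging picks up new fact_* tables
      # automatically as they come online.
      - name: analytics_wh.prod.fact_*.*

tag_deployments:
  - collection:
      name: Product Intelligence
    deployments:
      - tag_id: CRITICAL_TABLES
        metrics:
          - metric_type:
              predefined_metric: FRESHNESS
          - metric_type:
              predefined_metric: VOLUME
```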

Beyond out-of-the-box data pipeline and quality metrics, the Zoom team also takes advantage of Bigeye metric templates to express complex business-logic checks in SQL. This allows them to automate checks like foreign key verification, along the lines of the sketch below, while still getting the other benefits Bigeye has to offer.
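
Bigeye's template grammar isn't reproduced in this story, so the following is a hedged sketch of the kind of SQL a foreign-key template reduces to: count child rows with no matching parent, where a healthy table returns zero. The braced table and column names are placeholders, not Bigeye's exact template syntax.

```sql
-- Illustrative foreign-key check of the kind a metric template can express;
-- {braced} names are placeholders, not Bigeye's exact template grammar.
-- A healthy table returns 0 (no orphaned child rows).
SELECT COUNT(*) AS orphaned_rows
FROM {child_table} AS c
LEFT JOIN {parent_table} AS p
  ON c.{fk_column} = p.{pk_column}
WHERE p.{pk_column} IS NULL
```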

Results

Since implementing Bigeye, the Zoom data team has proactively identified and resolved 2–3 data pipeline issues per month, improving data fidelity and reducing backfill work. This has resulted in significant time savings for Zoom's data engineering teams and more reliable data for Zoom analysts and decision makers.

In addition, the team can now automatically deploy comprehensive data quality monitoring on their most critical tables as part of their engineering workflow. This allows them to streamline data quality assurance at enterprise scale by integrating it into their processes as code.

As a data engineer, I love that I can use the Bigeye Airflow operator to ensure ingested data loads correctly. Before Bigeye, my stakeholders would ping me every one to two weeks to let me know they needed assistance. Now, I get immediate insight when something is wrong so I can proactively notify downstream users and correct it before it causes a problem.
Tina Chang
Data Engineer

