Product
February 21, 2024

How Zoom Prevents Three Outages A Month With Bigeye

By adopting Bigeye, Zoom has transformed its data operations, moving from reactive problem-solving to proactive data quality assurance.

Adrianna Vidal

In the fast-paced world of virtual collaboration, Zoom has emerged as a household name, facilitating remote work meetings, social gatherings, and online classes. With this rapid growth came the challenge of ensuring data quality and pipeline observability to meet evolving business needs.

The Challenge: Scaling Data Systems for Growth

Zoom's data team operates under a hub-and-spoke model, supporting business units like product intelligence, marketing, and customer service. As the company grew, the team faced challenges in tracking data quality and gaining unified insight into the operational health of its data pipelines.

The team implemented custom SQL checks to validate data, but this approach required them to address data outages and quality issues on a case-by-case basis. Simple issues could be resolved quickly, but complex issues often took weeks to identify and address, leading to work disruptions and frustration for business users.
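For illustration only, a hand-rolled check of this kind might look like the sketch below. The analytics.meeting_events table, its ingested_at column, and the thresholds are invented for this example, written in generic warehouse SQL:

```sql
-- Hypothetical hand-rolled quality check (generic warehouse SQL).
-- The table, column, and thresholds are invented for illustration.
SELECT
  COUNT(*)         AS rows_last_day,   -- volume check
  MAX(ingested_at) AS latest_row,      -- freshness check
  CASE
    WHEN MAX(ingested_at) IS NULL
      OR MAX(ingested_at) < CURRENT_TIMESTAMP - INTERVAL '2' HOUR
    THEN 'ALERT' ELSE 'OK'
  END AS freshness_status
FROM analytics.meeting_events
WHERE ingested_at >= CURRENT_TIMESTAMP - INTERVAL '1' DAY;
```

Checks like this work, but each one has to be written, scheduled, and triaged by hand, which is why they scale poorly as tables multiply.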

The Solution: Proactive Data Quality Assurance with Bigeye

Zoom adopted Bigeye to build holistic data quality testing and pipeline observability into their data product workflows, shifting from reactive firefighting to proactive data quality assurance. They leveraged Bigeye's REST API and Airflow operator to track data ingestion job success with built-in freshness and volume monitoring.
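As a rough sketch of that pattern (not Zoom's implementation, and not Bigeye's documented API): an Airflow task can call a Bigeye REST endpoint after an ingestion job completes. The endpoint path, payload fields, and table names below are placeholders; Bigeye's published Airflow operator and REST API reference are the authoritative sources.

```python
# Minimal sketch: trigger Bigeye metric runs from an Airflow DAG.
# The endpoint path and payload are illustrative placeholders, not
# Bigeye's documented API; consult Bigeye's docs for the real
# operator and REST endpoints.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

BIGEYE_URL = "https://app.bigeye.com"  # placeholder host
BIGEYE_TOKEN = "..."                   # loaded from a secrets backend in practice

def run_metrics_for_table(schema: str, table: str) -> None:
    """Ask Bigeye to run the metrics defined on a table (illustrative)."""
    resp = requests.post(
        f"{BIGEYE_URL}/api/v1/metrics/run",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {BIGEYE_TOKEN}"},
        json={"schemaName": schema, "tableName": table},
        timeout=30,
    )
    resp.raise_for_status()

with DAG(
    dag_id="ingest_meeting_events",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    check_quality = PythonOperator(
        task_id="run_bigeye_metrics",
        python_callable=run_metrics_for_table,
        op_kwargs={"schema": "analytics", "table": "meeting_events"},
    )
```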

Using Bigconfig, Bigeye's YAML-based monitoring-as-code solution, the team automatically applied data quality checks to critical tables. Dynamic tagging then applied the appropriate checks to new tables as they came online.
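A Bigconfig file of this general shape might look roughly like the sketch below. The tag names and column selectors are invented, and the exact keys should be verified against Bigeye's current Bigconfig reference:

```yaml
# Illustrative Bigconfig sketch; field names follow the general shape
# of Bigeye's documented format but should be verified against the
# current Bigconfig reference before use.
type: BIGCONFIG_FILE
tag_definitions:
  - tag_id: CRITICAL_TABLES
    column_selectors:
      - name: warehouse.analytics.meeting_events.*  # invented selector
tag_deployments:
  - collection:
      name: Critical pipeline tables
    deployments:
      - tag_id: CRITICAL_TABLES
        metrics:
          - metric_type:
              predefined_metric: FRESHNESS
          - metric_type:
              predefined_metric: VOLUME
```

Because checks are attached to tags rather than to individual tables, any new table matching a selector inherits the same monitoring without extra engineering work.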

The Results: Improved Data Fidelity and Time Savings

Since implementing Bigeye, the Zoom data team has proactively identified and resolved 2–3 data pipeline issues a month, reducing backfill work and creating higher data fidelity. This has resulted in significant time savings for the data engineering teams and more reliable data for analysts and decision-makers.

Moreover, the team can now lay down comprehensive data quality monitoring on their most critical tables as part of their engineering workflow, streamlining data quality assurance at enterprise scale.

With Bigeye in place, Zoom has moved its data operations from reactive problem-solving to proactive data quality assurance. Enhanced data fidelity and streamlined processes let Zoom continue to power the world of virtual collaboration effectively.

As Tina Chang, a data engineer for the Zoom product intelligence group, puts it, "Before Bigeye, my stakeholders would ping me every one to two weeks to let me know they needed assistance. Now, I get immediate insight when something is wrong so I can proactively notify downstream users and correct it before it causes a problem."

Learn why Bigeye is the data observability platform of choice for Fortune 500 data teams

Stop by booth #416 at the Gartner Data & Analytics Summit to see a demo of how Bigeye can monitor and track lineage across your entire enterprise data environment: across cloud and traditional environments, through ETL systems, and into your critical BI and analytics dashboards.

Book a time for a demo or meet with our team.

| Resource | Monthly cost ($) | Number of resources | Time (months) | Total cost ($) |
| --- | --- | --- | --- | --- |
| Software/Data engineer | 15,000 | 3 | 12 | 540,000 |
| Data analyst | 12,000 | 2 | 6 | 144,000 |
| Business analyst | 10,000 | 1 | 3 | 30,000 |
| Data/product manager | 20,000 | 2 | 6 | 240,000 |
| Total cost | | | | 954,000 |
| Role | Goals | Common needs |
| --- | --- | --- |
| Data engineers | Overall data flow. Data is fresh and operating at full volume. Jobs are always running, so data outages don't impact downstream systems. | Freshness and volume monitoring; schema change detection; lineage monitoring |
| Data scientists | Specific datasets in great detail. Looking for outliers, duplication, and other, sometimes subtle, issues that could affect their analysis or machine learning models. | Freshness monitoring; completeness monitoring; duplicate detection; outlier detection; distribution shift detection; dimensional slicing and dicing |
| Analytics engineers | Rapidly testing the changes they're making within the data model. Move fast and not break things, without spending hours writing tons of pipeline tests. | Lineage monitoring; ETL blue/green testing |
| Business intelligence analysts | The business impact of data. Understand where they should spend their time digging in, and when they have a red herring caused by a data pipeline problem. | Integration with analytics tools; anomaly detection; custom business metrics; dimensional slicing and dicing |
| Other stakeholders | Data reliability. Customers and stakeholders don't want data issues to bog them down, delay deadlines, or provide inaccurate information. | Integration with analytics tools; reporting and insights |
