ZoomInfo Technologies - Vancouver, WA

posted 2 months ago

Full-time - Mid Level
Vancouver, WA
10,001+ employees
Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services

About the position

The Data Engineer II role at ZoomInfo involves enhancing and architecting data monitoring and quality-assurance systems, as well as building ETL processes and decision-making logic within big data pipelines. The position requires collaboration with a dedicated team to design and implement data pipelines that deliver high-quality data to customers and internal stakeholders. The role combines development and technical leadership responsibilities, offering exposure to cloud-native technologies and significant enterprise-scale initiatives.

Responsibilities

  • Design and implement ETL processes to handle the ingestion of data, considering downstream impacts.
  • Design and implement data monitoring pipelines to proactively identify and resolve data quality issues.
  • Collaborate with stakeholders to define project requirements and develop metrics representing data pipeline business value.
  • Innovate and develop new methodologies to enhance access to trustworthy data.
  • Own monitoring efforts for projects from dashboard wireframe to finished pipeline.

Requirements

  • Bachelor's Degree in Computer Science, Engineering, or a related STEM field, with a focus on Data Processing or Data Analysis.
  • At least 3 years of experience in Data Engineering or a similar role, with a proven track record of working with big data pipelines and analytics.
  • Minimum of 2 years of hands-on experience with SQL in scalable data warehouses (e.g., BigQuery, Snowflake).
  • Experience in implementing best practices for data engineering and optimizing data pipeline performance/scalability.
  • Proficiency in cloud technologies, preferably Google Cloud Platform and/or AWS.
  • Expertise with Apache Airflow.
  • 2+ years of experience with Apache Spark.
  • 3+ years of coding experience in Python.
  • Understanding of distributed systems and effective data management.
  • Knowledge of commonly utilized data structures in databases and data warehouses.

Nice-to-haves

  • Familiarity with Agile methodologies and CI/CD practices.
  • Familiarity with Infrastructure as Code (e.g., Terraform).
  • Proficiency in data visualization tools (e.g., Tableau).

Benefits

  • Comprehensive benefits package
  • Holistic mind, body and lifestyle programs designed for overall well-being