
Vista Equity Partners · posted 3 months ago
Mid Level
Austin, TX
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

About the position

DataOps Engineers build and manage our data pipeline operations, ensuring smooth, efficient, and high-quality data flow across our platforms. This role emphasizes data pipeline management, data quality testing, and the optimization of data delivery. As part of our growing data team, you'll play a pivotal role in ensuring data is processed, validated, and made available efficiently and reliably for business and analytical use.

Responsibilities

  • Oversee the development, maintenance, and continuous improvement of data pipelines to ensure timely and reliable delivery of data across systems.
  • Implement and maintain automated data validation and quality testing frameworks to ensure accuracy, consistency, and reliability of data.
  • Monitor and track pipeline performance, proactively identifying and resolving bottlenecks or failures to maintain optimal data flow.
  • Collaborate with data and software engineers to integrate new data sources and pipelines, ensuring they meet business and technical requirements.
  • Implement data profiling tools and techniques to continuously assess data quality, detect anomalies, and ensure data adheres to predefined quality standards.
  • Automate routine tasks related to data ingestion, transformation, and delivery using appropriate tools and technologies, minimizing manual intervention.
  • Partner with data engineers, analysts, and business teams to ensure that data is accessible, accurate, and up to date for decision-making and reporting needs.
  • Maintain thorough documentation for data pipelines, data flows, data quality checks, and operational processes.
  • Ensure data governance policies are enforced across data pipelines, including data lineage tracking, auditability, and access control.
  • Evaluate and recommend tools and practices to improve the reliability, scalability, and performance of data operations.

Requirements

  • 3+ years of experience in DataOps, Data Engineering, or Data Management roles, with a strong focus on data pipeline management and data quality.
  • Proficiency with data orchestration and ETL/ELT tools such as Apache Airflow, Azure Data Factory, dbt, Fivetran, or similar to manage complex data pipelines.
  • Experience with data quality testing and monitoring tools (e.g., Great Expectations, dbt tests, or custom validation frameworks).
  • Knowledge of ETL/ELT processes and the ability to implement scalable data transformation pipelines.
  • Strong proficiency in SQL for querying and transforming data, as well as scripting languages like Python for automating data operations.
  • Familiarity with cloud platforms (AWS, Azure, or Google Cloud) for managing cloud-based data infrastructure and pipelines.
  • Experience implementing data governance frameworks, including data lineage, metadata management, and compliance with regulatory standards.
  • Knowledge of managing data in cloud-based data warehouses such as Snowflake, Redshift, or BigQuery.
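To illustrate the kind of "custom validation framework" mentioned above, here is a minimal sketch of rule-based data quality checks in Python. All names (`check_not_null`, `check_in_range`, `validate`, the `amount` column) are hypothetical, chosen for illustration only; in practice a tool such as Great Expectations or dbt tests would provide this functionality.

```python
# Hypothetical sketch of a custom data quality check: each check returns the
# indices of failing rows, and validate() collects only the checks that failed.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_in_range(rows, column, lo, hi):
    """Return indices of rows where `column` is present but outside [lo, hi]."""
    return [
        i for i, row in enumerate(rows)
        if row.get(column) is not None and not (lo <= row[column] <= hi)
    ]

def validate(rows):
    """Run all checks; report only the ones with failures."""
    failures = {
        "amount_not_null": check_not_null(rows, "amount"),
        "amount_in_range": check_in_range(rows, "amount", 0, 1_000_000),
    }
    return {name: idx for name, idx in failures.items() if idx}

records = [
    {"trade_id": 1, "amount": 500},
    {"trade_id": 2, "amount": None},   # fails the not-null check
    {"trade_id": 3, "amount": -10},    # fails the range check
]
print(validate(records))  # → {'amount_not_null': [1], 'amount_in_range': [2]}
```

In a production pipeline, checks like these would typically run as a gating step after ingestion, failing the pipeline run (or quarantining bad rows) before downstream consumers see the data.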

Nice-to-haves

  • Experience in automating data validation, pipeline management, and delivery processes to ensure efficient and error-free operations.
  • Familiarity with big data technologies (e.g., Spark, Kafka) and real-time data streaming tools.
  • Experience integrating a variety of data sources (APIs, databases, flat files) into data systems.