Ardent Mills - Denver, CO

posted 5 days ago

Full-time
Food Manufacturing

About the position

As a Data Engineer II, you will design, develop, and maintain robust, scalable data pipelines, data architecture, and data warehouses to support business intelligence and analytics initiatives. The role involves collaborating with cross-functional teams to ensure data is accessible, accurate, and optimized for decision-making.

Responsibilities

  • Design, develop, and optimize large-scale ETL/ELT pipelines to ingest, process, and transform data from various sources.
  • Design and maintain scalable and efficient data warehouse solutions.
  • Work closely with product managers, software architects, and business stakeholders to understand business requirements and translate them into efficient data models and solutions.
  • Optimize the performance of data warehouses and data pipelines for both batch and real-time processing, ensuring low latency and high throughput.
  • Implement automation processes and monitoring solutions for data pipelines to ensure data integrity and operational efficiency.
  • Establish and enforce data quality, security, and governance protocols to meet compliance requirements.
  • Provide guidance and mentorship to junior data engineers, fostering a collaborative and learning-focused environment.

Requirements

  • 5+ years of experience in data engineering, with a strong focus on building and maintaining data pipelines and data warehouses.
  • Proven experience with data warehousing solutions such as Snowflake, Amazon Redshift, Azure Synapse, or Google BigQuery.
  • Significant hands-on experience using Databricks for data engineering, ETL processes, and analytics.
  • Strong experience with cloud services such as AWS, Azure, or Google Cloud, especially in managing cloud-based data infrastructure.
  • Proficiency in Python, Scala, or Java for data processing, pipeline development, and automation.
  • Expertise in writing complex SQL queries and working with large-scale relational databases.
  • Proficiency with orchestration tools such as Apache Airflow, Databricks Workflows, or equivalent.
  • Familiarity with DevOps tools and practices, including Terraform, Docker, or Kubernetes for managing data infrastructure and CI/CD pipelines.
  • Strong problem-solving and analytical skills, with a focus on delivering scalable and reliable data solutions.

Nice-to-haves

  • Experience with modern data lakehouse solutions.
  • Familiarity with Agile methodologies and working in fast-paced environments.