Staff Data Engineer

Dutchie
Remote, OR
$190,000

About The Position

We are seeking an experienced and visionary Staff Data Engineer to play a pivotal role in shaping our data strategy, architecture, and infrastructure. This individual will serve as a technical and thought leader, leveraging deep expertise to design and implement cutting-edge data solutions. The ideal candidate will possess extensive experience with modern data engineering tools and platforms, databases, cloud technologies, and observability systems. This role involves not only building robust data systems but also driving innovation to enhance customer experiences and power smarter business decisions.

Requirements

  • 8+ years of hands-on experience in data engineering or a related field.
  • Expertise in modern data tools and platforms, including Snowflake, Fivetran, and Dagster.
  • Strong proficiency with database technologies: SQL Server, PostgreSQL, MongoDB, and AWS RDS.
  • Advanced knowledge of AWS cloud services, including data-centric solutions.
  • Proficiency in Infrastructure-as-Code (e.g., Pulumi) and container orchestration tools like Kubernetes.
  • Extensive experience in data modeling, schema design, and database optimization.
  • Familiarity with observability tools such as Datadog, Grafana, or Prometheus.
  • Proficiency in programming languages such as Python or Scala for data engineering tasks, and C# or Ruby for application development.

Nice To Haves

  • Experience with additional cloud platforms such as Azure or GCP.
  • Knowledge of distributed systems and big data technologies.
  • Familiarity with CI/CD pipelines and version control systems.
  • Experience leading small-to-medium-sized development teams.

Responsibilities

  • Lead the design and implementation of scalable, reliable, and secure data architectures and pipelines.
  • Establish best practices and frameworks for data engineering, ensuring performance, scalability, and maintainability.
  • Mentor and collaborate with team members, fostering growth and innovation.
  • Build and optimize ETL/ELT pipelines using tools such as Fivetran and Dagster.
  • Architect, deploy, and manage data warehouses and lakes, with a focus on Snowflake.
  • Leverage Infrastructure-as-Code (e.g., Pulumi) to automate the provisioning and management of resources.
  • Deploy and run data services in Kubernetes for scalability and efficiency.
  • Design and maintain data models across various database technologies, including SQL Server, PostgreSQL, MongoDB, and AWS RDS.
  • Develop advanced data models to support business intelligence and analytics needs.
  • Optimize database performance and ensure robust data governance practices.
  • Design and implement cloud-native solutions in AWS.
  • Optimize cost, performance, and scalability of cloud infrastructure.
  • Set up and maintain observability platforms (e.g., Datadog) to monitor data workflows, system health, and performance metrics.
  • Establish robust logging, alerting, and dashboard systems for proactive issue resolution.
  • Stay ahead of industry trends, introducing new tools and methodologies to improve data engineering practices.
  • Collaborate with stakeholders to align technical strategies with business objectives.

Benefits

  • Full medical benefits including dental and vision plans.
  • Equity packages in the form of stock options for all employees.
  • Technology allowance for hardware, software, and reading materials.
  • Flexible vacation and sick days.