Lead Data Engineer

$135,000 - $140,000/Yr

American Red Cross

posted 2 months ago

Full-time - Mid Level
Remote
Social Assistance

About the position

The American Red Cross is seeking a Lead Data Warehouse Engineer to join our Enterprise Data & Analytics Team. This role is pivotal in transforming our data and reporting capabilities across various verticals by implementing a modernized data architecture and technical stack. As a Lead Data Warehouse Engineer, you will develop, test, maintain, and support an enterprise data warehouse system, ensuring it meets agreed-upon standards for database design and implementation. This position requires an innovative engineer who is passionate about data quality and automation, with a strong background in data warehousing and the ability to develop and deploy scalable data pipelines. Your work will directly impact our mission to help people in need by making data management, analytics, and reporting faster, more insightful, and more efficient.

In this fully remote position, you will collaborate with analytics and business teams to enhance the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization. You will build and implement scalable solutions that align with our data governance standards and architectural roadmap for data integration, storage, reporting, and analytics. You will also design, develop, and test data integration solutions, automate deployments using Git pipelines, and mentor less experienced team members through code reviews and pair programming. Additionally, you will perform data analysis to troubleshoot data-related issues and build a data quality framework to assist in resolving them.

This is an individual contributor role that works under limited supervision, requiring subject matter knowledge and the ability to understand specific needs or requirements and apply your skills effectively.

Responsibilities

  • Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
  • Build and implement scalable solutions that align to our data governance standards and architectural roadmap for data integrations, data storage, reporting, and analytic solutions.
  • Design, develop, and test data integration solutions; write, automate, and document unit, integration, and functional tests.
  • Manage automated deployments using Git pipelines, ensuring code changes are efficiently version-controlled and deployed.
  • Collaborate with multiple teams to streamline deployment workflows, and monitor and troubleshoot pipelines for reliability and performance.
  • Perform data analysis to troubleshoot data-related issues, and build a data quality framework to assist in their resolution.
  • Serve as tech lead by mentoring less experienced members of the team through code reviews, pair programming, and similar hands-on interactions.

Requirements

  • 4-year college degree or equivalent combination of education and experience, preferably in Computer Science, Engineering, Mathematics, or a related technical field.
  • 7+ years of relevant work experience in data engineering, business intelligence, or a related field.
  • Experience with a variety of database technologies and data warehouse schema design patterns, particularly snowflake and star schemas.
  • Experience with cloud-based databases, specifically AWS technologies (e.g., Redshift, RDS, S3, EC2, EKS, Zero ETL).
  • Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
  • Experience creating ETL and/or ELT jobs.
  • Experience with Agile software development methodologies.
  • Excellent problem-solving and troubleshooting skills.
  • Process-oriented with great documentation skills.
  • Proficient in object-oriented programming, particularly Python.
  • Experience with DevOps methodologies and tools (e.g., Git, Artifactory, etc.).
  • Experience developing in a Linux environment.
  • Experience developing integrations across multiple systems and APIs is a plus.
  • Experience with Big Data tools like Spark, Hadoop, Kafka, etc. is a plus.

Nice-to-haves

  • Experience with data visualization tools such as Tableau or Power BI.
  • Familiarity with machine learning concepts and tools.
  • Knowledge of data governance frameworks and best practices.

Benefits

  • Medical, Dental, & Vision Plans
  • Health Spending Accounts & Flexible Spending Accounts
  • PTO + Holidays
  • 401K with up to 5% Match
  • Paid Family Leave
  • Employee Assistance Programs
  • Short- and Long-Term Disability Insurance
  • Service Awards and Recognition