McDonald's - Chicago, IL

posted 2 months ago

Full-time
Chicago, IL
Food Services and Drinking Places

About the position

McDonald's Global Technology - Data & Analytics team is seeking a Data Engineer with a deep understanding of the Data Product Lifecycle, Standards, and Practices. In this role, you will be responsible for building scalable and efficient data solutions that support the company's data products and analytics initiatives. You will work closely with data scientists, analysts, and other cross-functional teams to ensure the availability, reliability, and performance of data systems, and your expertise in cloud computing platforms and data engineering best practices will be pivotal in delivering high-quality data products and enabling data-driven decision-making.

You will build and maintain relevant and reliable data products that align with business needs, developing and implementing new technology solutions to enhance data reliability and observability. Your participation in new software development will involve defining the business rules that determine data quality, assisting the product owner in writing test scripts, and performing rigorous testing to ensure data quality. A solid understanding of the technical details of data domains will be essential, as you will need a clear grasp of the business problems being solved.

Your responsibilities will include designing and developing data pipelines and ETL processes to extract, transform, and load data from various sources using AWS data services such as S3, Redshift, and Glue. You will implement and maintain scalable data architectures that support efficient data storage, retrieval, and processing, while collaborating with data scientists and analysts to understand data requirements and ensure data accuracy, integrity, and availability. Additionally, you will build and optimize data integration workflows, monitor and troubleshoot data pipelines, and ensure data security and compliance with governance policies.

Managing data infrastructure on AWS will also be part of your role, including capacity planning, cost optimization, and resource allocation. Staying current with emerging data engineering technologies and best practices will be crucial for improving data systems and processes. You will also document data engineering processes, workflows, and solutions for knowledge sharing and future reference, and coordinate with teams distributed across time zones as needed.
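
The pipeline work described above can be made concrete with a minimal, hypothetical sketch: a PySpark job (of the kind that might run on EMR or as a Glue Spark job) that extracts raw files from S3, applies basic transformations, and writes a curated dataset back to S3 for loading into Redshift. The bucket names, columns, and schema below are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark ETL sketch: read raw order data from S3, standardize it,
# and write a partitioned Parquet dataset back to S3 for downstream loading
# into Redshift. All bucket, path, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV files landed in an S3 ingestion prefix
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: cast types, standardize timestamps, drop exact duplicates
orders = (
    raw.withColumn("order_total", F.col("order_total").cast("decimal(10,2)"))
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Load: partitioned Parquet that Redshift Spectrum or a COPY job can consume
(orders.write
       .mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-curated-bucket/orders/"))
```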

Responsibilities

  • Builds and maintains relevant and reliable data products that support the business needs.
  • Develops and implements new technology solutions as needed, with ongoing improvement of data reliability and observability in view.
  • Participates in new software development efforts.
  • Helps to define business rules that determine the quality of data.
  • Assists the product owner in writing test scripts that validate business rules.
  • Performs detailed and rigorous testing to ensure data quality (a sketch of such checks follows this list).
  • Develops a solid understanding of the technical details of data domains.
  • Designs and develops data pipelines and ETL processes to extract, transform, and load data from various sources into AWS data storage solutions.
  • Implements and maintains scalable data architectures that support efficient data storage, retrieval, and processing.
  • Collaborates with data scientists and analysts to understand data requirements and ensure data accuracy, integrity, and availability.
  • Builds and optimizes data integration workflows to connect data from different systems and platforms.
  • Monitors and troubleshoots data pipelines, identifying and resolving performance issues and bottlenecks.
  • Ensures data security and compliance with data governance policies and regulations.
  • Manages data infrastructure on AWS, including capacity planning, cost optimization, and resource allocation.
  • Stays up to date with emerging data engineering technologies, trends, and best practices.
  • Documents data engineering processes, workflows, and solutions for knowledge sharing and future reference.
  • Coordinates and works with teams distributed across time zones.
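
As a rough illustration of the business-rule and testing responsibilities above, the sketch below expresses a few hypothetical quality rules (completeness, validity, uniqueness) as PySpark checks. The rules, dataset, and paths are assumptions for illustration, not rules from the posting.

```python
# Hypothetical data-quality checks: completeness, validity, and uniqueness
# rules expressed as simple PySpark assertions. Dataset and rules are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-dq").getOrCreate()
orders = spark.read.parquet("s3://example-curated-bucket/orders/")
total = orders.count()

# Completeness: order_id must never be null
null_ids = orders.filter(F.col("order_id").isNull()).count()

# Validity: order totals must be non-negative
negative_totals = orders.filter(F.col("order_total") < 0).count()

# Uniqueness: exactly one row per order_id
duplicate_ids = total - orders.select("order_id").distinct().count()

failures = {
    "null order_id": null_ids,
    "negative order_total": negative_totals,
    "duplicate order_id": duplicate_ids,
}
failed = {rule: n for rule, n in failures.items() if n > 0}
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print(f"All checks passed on {total} rows")
```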

Requirements

  • Bachelor's or Master's degree in Computer Science or a related engineering field, plus deep experience with AWS infrastructure.
  • 7+ years of strong experience in data engineering, preferably with AWS backend tech stack, including but not limited to S3, Redshift, Glue, Lambda, EMR, and Athena.
  • 7+ years of proficiency in programming languages commonly used in data engineering, such as Python.
  • 5+ years of hands-on experience with big data processing frameworks, such as Apache Spark.
  • 5+ years of hands-on experience with data modeling, ETL development, and data integration techniques.
  • Working knowledge of relational and dimensional data design and modeling in a large multi-platform data environment.
  • Solid understanding of SQL and database concepts.
  • Expert knowledge of quality functions such as cleansing, standardization, parsing, de-duplication, mapping, and hierarchy management (see the sketch after this list).
  • Expert knowledge of data, master data, and metadata related standards, processes, and technology.
  • Ability to drive continuous data management quality (i.e., timeliness, completeness, accuracy) through defined and governed principles.
  • Ability to perform extensive data analysis (comparing multiple datasets) using a variety of tools.
  • Demonstrated experience in data management and data governance capabilities.
  • Familiarity with data warehousing principles and best practices.
  • Excellent problem solver who uses data and technology to solve problems and answer complex data-related questions.
  • Excellent communication and collaboration skills to work effectively in cross-functional teams.
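
The quality functions named in the requirements (cleansing, standardization, de-duplication) might look like the following hypothetical PySpark sketch; the customer feed, columns, and rules are illustrative assumptions rather than details from the posting.

```python
# Hypothetical cleansing / standardization / de-duplication pass over a
# customer feed. Paths and column names are assumed for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-cleansing").getOrCreate()
customers = spark.read.parquet("s3://example-raw-bucket/customers/")

cleaned = (
    customers
    # Cleansing: trim stray whitespace and drop rows missing the key
    .withColumn("email", F.trim(F.col("email")))
    .filter(F.col("customer_id").isNotNull())
    # Standardization: lowercase emails, strip non-digits from phone numbers
    .withColumn("email", F.lower(F.col("email")))
    .withColumn("phone", F.regexp_replace(F.col("phone"), r"[^0-9]", ""))
    # De-duplication: keep one row per customer_id
    .dropDuplicates(["customer_id"])
)

cleaned.write.mode("overwrite").parquet("s3://example-curated-bucket/customers/")
```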

Nice-to-haves

  • Experience with JIRA and Confluence as project workflow and documentation tools is a plus.
  • Experience with Agile project management methods and terminology is a plus.
  • Experience with Prometheus and Grafana is a plus.