Sr. Manager Data Engineering

$135,000 - $227,500/Yr

PepsiCo - Chicago, IL

posted 4 months ago

Full-time - Manager
Chicago, IL
Beverage and Tobacco Product Manufacturing

About the position

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation, unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is the ability to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development.

PepsiCo's Data Management and Operations team is responsible for developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. The team maintains a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company. It handles day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset, and works cross-functionally across the enterprise to centralize and standardize data for use by business, data science, and other stakeholders, increasing awareness of available data and democratizing access to it across the company.

As a data engineering manager, you will be the key technical expert overseeing PepsiCo's data product build and operations and will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products in areas like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities

  • Provide leadership and management to a team of data engineers, managing processes and their flow of work, vetting their designs, and mentoring them to realize their full potential.
  • Engage with team members, using informal and structured approaches to career development to focus on individual improvement/capabilities, and to provide balanced feedback.
  • Act as a subject matter expert across different digital projects.
  • Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers.
  • Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
  • Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
  • Responsible for implementing best practices around systems integration, security, performance, and data management.
  • Empower the business by creating value through increased adoption of data, data science, and business intelligence.
  • Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions.
  • Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners.
  • Develop and optimize procedures to productionize data science models.
  • Define and manage SLAs for data products and processes running in production.
  • Support large-scale experimentation done by data scientists.
  • Prototype new approaches and build solutions at scale.
  • Research state-of-the-art methodologies.
  • Create documentation for learnings and knowledge transfer.
  • Create and audit reusable packages or libraries.

Requirements

  • 10+ years of overall technology experience, including at least 8 years of hands-on software development, data engineering, and systems architecture.
  • 8+ years of experience with Salesforce Cloud Technologies is a must.
  • 8+ years of experience with Salesforce Customer data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
  • 8+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools.
  • 8+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc.
  • 5+ years of cloud data engineering experience in at least one cloud (Azure, AWS, Google Cloud Platform).
  • Fluent with Azure cloud services. Azure Certification is a plus.
  • Experience with Salesforce Data Cloud or another CDP software.
  • Experience scaling and managing a team of 5+ engineers.
  • Experience with integration of multi-cloud services with on-premises technologies.
  • Experience with data quality and data profiling tools like Apache Griffin, Deequ, and Great Expectations.
  • Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets.
  • Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake.
  • Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
  • Experience with version control systems like GitHub and with deployment and CI tools.
  • Experience with Azure Data Factory, Databricks, and MLflow is a plus.
  • Experience with Statistical/ML techniques is a plus.
  • Experience with building solutions in the retail or in the supply chain space is a plus.
  • Understanding of metadata management, data lineage, and data glossaries is a plus.
  • Working knowledge of agile development, including DevOps and DataOps concepts.
  • Familiarity with business intelligence tools (such as Power BI).
  • BA/BS in Computer Science, Math, Physics, or other technical fields.

Nice-to-haves

  • Salesforce Data Cloud Accreditation
  • Relevant Salesforce certifications and consulting experience are strongly recommended
  • Familiarity with data regulations.

Benefits

  • Medical, Dental, Vision, Disability, Health and Dependent Care Reimbursement Accounts, Employee Assistance Program (EAP), Insurance (Accident, Group Legal, Life), Defined Contribution Retirement Plan.
  • Paid time off subject to eligibility, including paid parental leave, vacation, sick, and bereavement.