Gopuff - Philadelphia, PA

posted about 2 months ago

Full-time - Mid Level
Philadelphia, PA
Couriers and Messengers

About the position

Gopuff is seeking a Data Platform Engineering Manager to lead its Data Platform team and shape the technical direction of the platform that enables analytics, data science, and machine learning at scale. The role combines hands-on engineering with people management and requires strong data and cloud engineering expertise to influence data platforms and architecture across Gopuff.

Responsibilities

  • Lead and mentor a team of Data Platform Engineers to develop Gopuff's central data platform, fostering a collaborative and innovative team culture.
  • Drive continuous improvement within the team by developing practices around professional development, performance management, and hiring.
  • Provide clear technical direction and support for the team, ensuring the successful execution of data platform projects and initiatives.
  • Collaborate with engineering and product leadership to translate business requirements into effective technical solutions.
  • Develop and promote best practices for data collection, storage, and processing that impact the company-wide data strategy across Gopuff's data lake and warehouse.
  • Architect and implement large-scale data platforms that enable analytics, data science, and machine learning in a multi-cloud environment.
  • Partner with software and analytics engineering teams to establish data contracts and improve data quality at every stage of the data lifecycle.
  • Manage project timelines, resource allocation, and team performance, ensuring that deliverables are met on schedule and within scope.

Requirements

  • 5+ years of experience in a data engineering or cloud/infrastructure engineering role.
  • 3+ years of engineering management experience.
  • Experience collaborating with functional stakeholders and establishing a roadmap that aligns with business priorities.
  • Experience with organizational design, team development, hiring, and performance management.
  • Experience building batch data pipelines using DAG-based tools such as Dagster or Airflow.
  • Experience deploying applications and services to Kubernetes and using related tools in the Kubernetes ecosystem (e.g., Helm, ArgoCD, Istio).
  • Experience implementing DevOps best practices within the data platform, including solutions for CI/CD, data observability, monitoring, and lineage.
  • Experience producing to and consuming from streaming platforms such as Apache Kafka, AWS Kinesis, or Azure Event Hubs.
  • Experience with infrastructure-as-code tools such as Terraform.
  • Experience developing real-time data pipelines using frameworks such as Apache Beam, Flink, Storm, Spark Streaming, etc.
  • Experience with data warehouses, data lakes, and their underlying infrastructure.
  • Proficiency in Python, SQL, and RESTful API development.
  • Experience with cloud computing platforms such as Azure and AWS.
  • Experience with data governance, schema design, and schema evolution.