Data Platform Engineering Manager

$160,000 - $180,000/Yr

GoPuff - New York, NY

posted about 1 month ago

Full-time - Senior
Remote - New York, NY
1-10 employees
Couriers and Messengers

About the position

Gopuff is seeking a Data Platform Engineering Manager to lead its Data Platform team. This role shapes the technical direction of the team and enables analytics, data science, and machine learning at scale. The ideal candidate will bring strong data and cloud engineering expertise, combining hands-on engineering with people management. The position is remote for NYC-based candidates, with the option to work onsite if desired.

Responsibilities

  • Lead and mentor a team of Data Platform Engineers to develop Gopuff's central data platform.
  • Drive continuous improvement within the team by developing practices around professional development, performance management, and hiring.
  • Provide clear technical direction and support for the team, ensuring the successful execution of data platform projects and initiatives.
  • Collaborate with engineering and product leadership to translate business requirements into effective technical solutions.
  • Develop and promote best practices for data collection, storage, and processing that impact the company-wide data strategy across Gopuff's data lake and warehouse.
  • Architect and implement large-scale data platforms that enable analytics, data science, and machine learning in a multi-cloud environment.
  • Partner with software and analytics engineering teams to establish data contracts and improve data quality at every stage of the data lifecycle.
  • Manage project timelines, resource allocation, and team performance, ensuring that deliverables are met on schedule and within scope.

Requirements

  • 5+ years of experience in a data engineering or cloud/infrastructure engineering role
  • 3+ years of engineering management experience
  • Experience collaborating with functional stakeholders and establishing a roadmap that aligns with business priorities
  • Experience with organizational design, team development, hiring, and performance management
  • Experience building batch data pipelines using DAG-based tools such as Dagster or Airflow (a minimal DAG sketch follows this list)
  • Experience deploying applications and services to Kubernetes and using related tools in the Kubernetes ecosystem (e.g., Helm, ArgoCD, Istio)
  • Experience implementing DevOps best practices within the data platform, including solutions for CI/CD, data observability, monitoring, and lineage
  • Experience producing to and consuming from topics in Apache Kafka, AWS Kinesis, or Azure Event Hubs (see the Kafka sketch after this list)
  • Experience with Infrastructure-as-Code tools such as Terraform
  • Experience developing real-time data pipelines using frameworks such as Apache Beam, Flink, Storm, or Spark Streaming (see the streaming sketch after this list)
  • Experience with data warehouses, data lakes, and their underlying infrastructure
  • Proficiency in Python, SQL, and RESTful API development
  • Experience with cloud computing platforms such as Azure and AWS
  • Experience with data governance, schema design, and schema evolution
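
To make the DAG-based batch pipeline requirement concrete, here is a minimal sketch using Apache Airflow's TaskFlow API (Airflow 2.4+). The pipeline name, schedule, and extract/transform/load steps are illustrative assumptions rather than anything specified in the posting; a Dagster version would follow the same shape with its own decorators.

```python
# Minimal batch DAG sketch, assuming Apache Airflow 2.4+ (TaskFlow API).
# The DAG name, schedule, and task bodies below are placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_pipeline():
    @task
    def extract() -> list:
        # Stand-in for pulling raw events from an upstream source.
        return [{"order_id": 1, "total": 12.50}, {"order_id": 2, "total": 0.0}]

    @task
    def transform(rows: list) -> list:
        # Stand-in for cleaning and business logic.
        return [r for r in rows if r["total"] > 0]

    @task
    def load(rows: list) -> None:
        # Stand-in for writing to the data lake or warehouse.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


daily_orders_pipeline()
```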
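
For the Kafka requirement, the sketch below shows producing to and consuming from a topic with the confluent-kafka Python client; the broker address, topic name, and consumer group are hypothetical placeholders.

```python
# Minimal produce/consume sketch, assuming the confluent-kafka Python client.
# Broker address, topic, and group id are placeholders.
from confluent_kafka import Consumer, Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("order-events", key="order-1", value=b'{"total": 12.5}')
producer.flush()  # block until the message is delivered

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "data-platform-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["order-events"])
msg = consumer.poll(timeout=5.0)  # returns None if nothing arrives in time
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```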
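
And for the real-time pipeline requirement, a minimal Apache Beam (Python SDK) sketch of windowed aggregation, the core primitive behind most streaming jobs. The element values, timestamps, and 60-second window are illustrative; a production pipeline would read from an unbounded source such as Kafka rather than an in-memory list.

```python
# Minimal windowed-aggregation sketch, assuming the Apache Beam Python SDK
# and its local DirectRunner. Values, timestamps, and window size are placeholders.
import apache_beam as beam
from apache_beam.transforms import window

with beam.Pipeline() as p:
    (
        p
        | "CreateEvents" >> beam.Create([("store-1", 12.5), ("store-1", 3.0), ("store-2", 7.25)])
        | "AddTimestamps" >> beam.Map(lambda kv: window.TimestampedValue(kv, 1700000000))
        | "FixedWindows" >> beam.WindowInto(window.FixedWindows(60))
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```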