
Full-time - Mid Level
New York, NY

About the position

As a GCP Data Engineer, you will lead technology innovation for clients by delivering world-class data solutions. The role involves integrating native GCP services and third-party technologies to architect scalable data warehouses, data lakes, and analytics platforms. You will manage the entire data lifecycle, from ingestion to visualization, in complex client environments.

Responsibilities

  • Lead a team in designing, developing, and deploying high-performance data analytics solutions.
  • Provide technical expertise from concept through operations, ensuring the successful deployment of large-scale data solutions.
  • Build secure and reliable data-centric services in GCP.
  • Implement end-to-end data analytics for complex environments, including data ingestion, transformation, and visualization.
  • Provide thought leadership on Big Data and analytic strategies for clients.
  • Support data migration and transformation projects, leveraging Google AutoML to enhance pipeline intelligence.

Requirements

  • 3+ years of experience with GCP data engineering, ingestion, and curation.
  • 3+ years of experience designing data models on GCP using BigQuery and Bigtable.
  • 1+ years of experience building and managing machine learning models with Vertex AI.
  • 1+ years of experience implementing MLOps for GenAI model deployment.
  • Extensive experience in large-scale architecture, solution design, and operationalization of data warehouses, data lakes, and analytics platforms on GCP.
  • Strong knowledge of GCP services, with at least 5 years of experience in cloud platforms and 2+ years of deep experience in GCP data services (e.g., Spark, Dataproc, Dataflow, BigQuery, Pub/Sub).
  • 3+ years of experience re-architecting data warehouses on GCP, designing and building production data pipelines using Java and Python.
  • Hands-on experience with GCP data lakes and ingestion solutions.
  • Experience with metadata management, Hadoop/NoSQL, performance engineering, and self-service data preparation tools like Trifacta or Paxata.
  • Bachelor's degree or equivalent work experience.
  • Google Cloud Professional Data Engineer or Professional Machine Learning Engineer certification required.
© 2024 Teal Labs, Inc