GCP Data Engineer

$93,400 - $224,600/Yr

Accenture - Phoenix, AZ

posted about 1 month ago

Full-time - Mid Level
Phoenix, AZ
Professional, Scientific, and Technical Services

About the position

The GCP Data Engineer at Accenture is responsible for architecting and modernizing enterprise data solutions on Google Cloud Platform (GCP). This role involves leading a team to design, develop, and deploy high-performance data analytics solutions, while providing technical expertise throughout the implementation process. The position requires a solid understanding of large-scale data architecture and operationalization, as well as the ability to communicate complex technical topics to non-technical stakeholders. The GCP Data Engineer will work on various client initiatives, leveraging a broad set of technology skills to deliver innovative solutions.

Responsibilities

  • Lead a team of data engineers in designing, developing, testing, and deploying high-performance data analytics solutions in GCP.
  • Work with implementation teams from concept to operations, providing deep technical subject matter expertise for deploying large scale data solutions.
  • Build solution architecture, provision infrastructure, and secure reliable data-centric services in GCP.
  • Implement end-to-end data analytics solutions for large-scale, complex client environments.
  • Analyze Big Data and analytical technologies on GCP and provide thought leadership to clients.
  • Communicate complex technical topics to non-technical business and senior executives.
  • Support data migration and transformation projects.
  • Utilize the Google AutoML framework to build intelligence into data pipelines.

Requirements

  • Minimum of 5 years' experience in any cloud platform, including 2 years of deep experience with GCP data and analytics services.
  • Minimum of 3 years' proven ability to re-architect and re-platform on-premises data warehouses to GCP.
  • Minimum of 3 years' expertise in architecting and implementing next-generation data and analytics platforms on GCP.
  • Minimum of 3 years' hands-on experience architecting and designing data lakes on GCP.
  • Minimum of 3 years' proficiency in designing and optimizing data models on GCP using BigQuery and Bigtable.
  • Minimum of 2 years' proficiency in using Google Cloud's Vertex AI platform for machine learning models.
  • Minimum of 2 years' experience implementing MLOps practices for GenAI models.
  • Bachelor's degree or equivalent work experience.

Nice-to-haves

  • Experience architecting and implementing metadata management, data governance, and security for data platforms on GCP.
  • Ability to design operations architecture and conduct performance engineering for large-scale data lakes.
  • Experience architecting and operating large production Hadoop/NoSQL clusters.
  • Familiarity with self-service data preparation tools on GCP.
  • 3+ years' experience writing complex SQL queries and stored procedures.
  • Experience with Generative AI Studio for prototyping generative AI models.
  • Familiarity with Google's Model Garden for accessing pre-trained GenAI models.
  • Google Cloud Certified Professional Data Engineer.

Benefits

  • Diversity and inclusion programs
  • Professional development opportunities
  • Health insurance
  • 401k plan
  • Paid time off
  • Flexible working hours