GCP Data Engineer

$93,400 - $224,600/Yr

Accenture - Carmel, IN

posted about 1 month ago

Full-time - Mid Level
Carmel, IN
Professional, Scientific, and Technical Services

About the position

The GCP Data Engineer at Accenture will lead technology innovation by architecting and modernizing enterprise data solutions on Google Cloud Platform (GCP). This role involves collaborating with a team to deliver high-performance data analytics solutions, drawing on a broad set of technology skills to design architectures that integrate GCP services and third-party technologies. The position requires a strong grounding in data engineering, analytics, and cloud technologies, with a focus on transforming client initiatives in a dynamic environment.

Responsibilities

  • Lead a team of data engineers in designing, developing, testing, and deploying high-performance data analytics solutions in GCP.
  • Work with implementation teams from concept to operations, providing deep technical expertise for deploying large-scale data solutions.
  • Implement end-to-end data analytics solutions for large-scale, complex client environments.
  • Analyze and understand Big Data and analytical technologies on GCP, providing thought leadership to clients.
  • Communicate complex technical topics to non-technical business stakeholders and senior executives.
  • Support data migration and transformation projects.

Requirements

  • Minimum of 5 years of experience with any cloud platform, including at least 2 years of in-depth experience with GCP data and analytics services.
  • Minimum of 3 years of proven ability to re-architect and re-platform on-premises data warehouses to GCP.
  • Minimum of 3 years of expertise in architecting and implementing data and analytics platforms on GCP.
  • Minimum of 3 years of hands-on experience architecting and designing data lakes on GCP.
  • Minimum of 3 years of proficiency in designing and optimizing data models on GCP using BigQuery and Bigtable.
  • Minimum of 2 years of proficiency in using Google Cloud's Vertex AI platform for building and managing machine learning models.

Nice-to-haves

  • Experience architecting and implementing metadata management, data governance, and security for data platforms on GCP.
  • Ability to design operations architecture and conduct performance engineering for large-scale data lakes.
  • Familiarity with introducing and operationalizing self-service data preparation tools on GCP.
  • Experience writing complex SQL queries and stored procedures.
  • Experience with Generative AI Studio for prototyping generative AI models.

Benefits

  • Diversity and inclusion programs
  • Professional development opportunities
  • Flexible work arrangements
  • Health insurance
  • 401k plan