GCP Data Engineer

$73,000 - $220,400/Yr

Accenture - Minneapolis, MN

posted 13 days ago

Full-time - Mid Level
Professional, Scientific, and Technical Services

About the position

The GCP Data Engineer at Accenture will lead technology innovation by architecting and modernizing enterprise data solutions on Google Cloud Platform (GCP). This role involves collaborating with a team to deliver high-performance data analytics solutions, drawing on a broad set of technology skills to design solutions that integrate GCP services with third-party technologies. The position requires a strong understanding of data architecture, data lakes, and analytics platforms, with a focus on delivering innovative solutions to complex business challenges.

Responsibilities

  • Lead a team of data engineers in designing, developing, testing, and deploying high-performance data analytics solutions in GCP.
  • Work with implementation teams from concept to operations, providing technical expertise for deploying large-scale data solutions.
  • Build solution architecture, provision infrastructure, and secure data-centric services in GCP.
  • Implement end-to-end data analytics solutions for large-scale client environments.
  • Analyze Big Data and analytical technologies on GCP to assist clients with their architecture and data strategies.
  • Communicate complex technical topics to non-technical business stakeholders and senior executives.
  • Support data migration and transformation projects.
  • Utilize Google's AutoML framework to enhance data pipelines.

Requirements

  • Minimum of 5 years of experience in any cloud platform, including 2 years of deep experience with GCP data and analytics services.
  • Minimum of 3 years of experience re-architecting on-premises data warehouses to GCP and building production data pipelines using Java, Python, or Scala.
  • Minimum of 3 years of expertise in architecting and implementing data and analytics platforms on GCP.
  • Minimum of 3 years of hands-on experience architecting data lakes on GCP and implementing data ingestion solutions.
  • Minimum of 3 years of proficiency in designing and optimizing data models on GCP using BigQuery and Bigtable.
  • Minimum of 2 years of proficiency in using Google Cloud's Vertex AI platform for machine learning models.
  • Bachelor's degree or equivalent work experience.

Nice-to-haves

  • Experience in architecting and implementing metadata management, data governance, and security for data platforms on GCP.
  • Ability to design operations architecture and conduct performance engineering for large-scale data lakes.
  • Experience with Hadoop/NoSQL clusters on-premises or in the cloud.
  • Familiarity with self-service data preparation tools on GCP.
  • 3+ years of experience writing complex SQL queries and stored procedures.
  • Experience with Generative AI Studio for prototyping generative AI models.
  • Familiarity with Google's Model Garden for accessing pre-trained GenAI models.

Benefits

  • Diversity and inclusion initiatives
  • Equal Employment Opportunity policies
  • Accommodations for disabilities or religious observances