GCP Data Platform Architect

$131,100 - $302,400/Yr

Accenture - New York, NY

posted about 1 month ago

Full-time - Senior
New York, NY
Professional, Scientific, and Technical Services

About the position

The GCP Data Platform Architect at Accenture is responsible for designing, implementing, and managing scalable and secure data solutions on the Google Cloud Platform (GCP). This role involves collaborating with various teams to understand data requirements and translating them into robust architectural blueprints, ensuring optimal performance and security of data solutions. The architect will lead technology innovation for clients, focusing on data warehousing, big data technologies, and data security best practices.

Responsibilities

  • Design and implement end-to-end data solutions on GCP, encompassing data ingestion, storage, processing, transformation, and analytics.
  • Architect, design, and deploy data warehouses and data lakes using technologies like BigQuery, Dataflow, and Dataproc.
  • Design and implement big data solutions on GCP, including data pipelines, streaming analytics, and machine learning workflows.
  • Establish and maintain robust data security frameworks, implement access controls, and ensure data governance practices.
  • Monitor, analyze, and optimize data platform performance for efficiency and cost-effectiveness.
  • Stay updated on the latest GCP data technologies, evaluating and recommending their adoption within the organization.
  • Work collaboratively with data engineers, data scientists, business analysts, and other stakeholders to understand requirements and deliver optimal solutions.
  • Develop clear and comprehensive documentation, including architectural diagrams, design specifications, and operational guidelines.

Requirements

  • 3+ years of professional experience with GCP data services, including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and related technologies.
  • 6+ years of professional experience with data warehousing concepts, data modeling, and ETL/ELT processes.
  • 6+ years of professional experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
  • 3+ years of professional experience applying data security principles and best practices on GCP, with a deep understanding of both.
  • 6+ years of experience implementing technical solutions and designing, evaluating, and investigating architectures in a cloud environment.

Nice-to-haves

  • Experience working in GCP, including organization policies, IAM, virtual machines, databases, and Kubernetes/containers.
  • Proficiency in using Google Cloud's Vertex AI platform for building, deploying, and managing machine learning models, including GenAI models.
  • Experience with Generative AI Studio for prototyping and experimenting with generative AI models.
  • Familiarity with Google's Model Garden and its offerings for accessing and deploying pre-trained GenAI models.
  • Experience in implementing MLOps practices for the development, deployment, and monitoring of GenAI models.
  • Excellent analytical and problem-solving skills.
  • Strong communication and interpersonal skills, capable of collaborating effectively with various teams.
  • GCP Professional Data Engineer or equivalent certifications are highly desirable.

Benefits

  • Competitive salary based on experience and location
  • Diversity and inclusion initiatives
  • Professional development opportunities
  • Flexible work arrangements
  • Health and wellness programs