Sr. GCP Data Engineer (hybrid)

$110,000 - $135,000/Yr

Cognizant Technology Solutions - Dearborn, MI

posted about 1 month ago

Full-time - Senior
Dearborn, MI
10,001+ employees
Professional, Scientific, and Technical Services

About the position

The Sr. GCP Data Engineer will be responsible for designing and deploying a data-centric architecture on Google Cloud Platform (GCP) for the Materials Management platform. This role involves integrating data from applications across Product Development, Manufacturing, Finance, Purchasing, and Supply Chain, ensuring efficient data processing and management.

Responsibilities

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools.
  • Build ETL pipelines to ingest data from heterogeneous sources into the system.
  • Develop data processing pipelines using programming languages like Java and Python.
  • Create and maintain data models for efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage databases, both SQL and NoSQL, based on project requirements.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on GCP.
  • Implement version control and CI/CD practices for data engineering workflows.
  • Utilize GCP monitoring and logging tools to identify and address performance bottlenecks.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Address code quality issues using tools like SonarQube and Checkmarx.
  • Implement security measures and data governance policies.
  • Collaborate with stakeholders to gather and define data requirements.
  • Develop and maintain documentation for data engineering processes.
  • Participate in on-call rotations to address critical issues.
  • Provide mentorship and guidance to junior team members.

Requirements

  • 8 years of professional experience in data engineering, data product development, and software product launches.
  • Experience with at least three of the following languages and technologies: Java, Python, Spark, Scala, SQL.
  • 4 years of cloud data/software engineering experience building scalable data pipelines using Google BigQuery and other tools.
  • Experience with workflow orchestration tools like Airflow.
  • Knowledge of relational databases like MySQL, PostgreSQL, and SQL Server.
  • Experience with real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
  • Familiarity with microservices architecture and REST APIs.
  • Experience with DevOps tools such as Tekton, GitHub Actions, and Docker.
  • Knowledge of project management tools like Atlassian JIRA.

Nice-to-haves

  • Automotive experience is preferred.
  • Experience supporting an onshore/offshore delivery model is preferred.
  • Excellent problem-solving skills.
  • Knowledge of and practical experience with agile delivery.

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan