Sr. GCP Data Engineer (Onsite)

$110,000 - $135,000/Yr

Cognizant Technology Solutions - Dearborn, MI

posted about 1 month ago

Full-time - Mid Level
Dearborn, MI
10,001+ employees
Professional, Scientific, and Technical Services

About the position

The Sr. GCP Data Engineer will be responsible for designing and deploying a data-centric architecture in Google Cloud Platform (GCP) for the Materials Management platform. This role involves integrating data from applications across Product Development, Manufacturing, Finance, Purchasing, and Supply Chain, ensuring efficient data processing and management.

Responsibilities

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools.
  • Build ETL pipelines to ingest data from heterogeneous sources into the system.
  • Develop data processing pipelines using programming languages like Java and Python.
  • Create and maintain data models for efficient storage and retrieval of large datasets.
  • Deploy and manage SQL and NoSQL databases based on project requirements.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on GCP.
  • Implement version control and CI/CD practices for data engineering workflows.
  • Utilize GCP monitoring and logging tools to identify and address performance bottlenecks.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Implement security measures and data governance policies.
  • Collaborate with stakeholders to gather and define data requirements.
  • Develop and maintain documentation for data engineering processes.
  • Participate in on-call rotations to address critical issues.
  • Provide mentorship and guidance to junior team members.

Requirements

  • 8 years of professional experience in data engineering and software product launches.
  • Proficiency in at least three of the following languages and frameworks: Java, Python, Spark, Scala, and SQL.
  • 4 years of cloud data/software engineering experience building scalable data pipelines.
  • Experience with data warehouses like Google BigQuery and workflow orchestration tools like Airflow.
  • Knowledge of relational databases like MySQL, PostgreSQL, and SQL Server.
  • Experience with real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
  • Familiarity with microservices architecture and REST APIs.
  • Experience with DevOps tools such as GitHub, Terraform, and Docker.
  • Knowledge of project management tools like Atlassian JIRA.

Nice-to-haves

  • Automotive experience is preferred.
  • Experience supporting an onshore/offshore delivery model is preferred.
  • Excellent problem-solving skills.
  • Knowledge of and practical experience with agile delivery.

Benefits

  • 401(k)
  • Dental insurance
  • Disability insurance
  • Employee stock purchase plan
  • Health insurance
  • Life insurance
  • Paid holidays
  • Paid time off