V2Soft - Allen Park, MI

posted 20 days ago

Full-time - Mid Level
Allen Park, MI
51-100 employees
Professional, Scientific, and Technical Services

About the position

The GCP Data Engineer at V2Soft is responsible for designing and implementing data-centric solutions on the Google Cloud Platform (GCP). The role involves building ETL and data processing pipelines and optimizing data workflows to ensure performance and reliability. The engineer will collaborate with stakeholders to define data requirements and mentor junior team members, contributing to a collaborative work environment.

Responsibilities

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools.
  • Build ETL pipelines to ingest data from heterogeneous sources.
  • Develop data processing pipelines using programming languages like Java and Python.
  • Create and maintain data models for efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage SQL and NoSQL databases based on project requirements.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on GCP.
  • Implement version control and CI/CD practices for data engineering workflows.
  • Utilize GCP monitoring and logging tools to identify and address performance bottlenecks.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Address code quality issues using tools like SonarQube and Checkmarx.
  • Implement security measures and data governance policies.
  • Collaborate with stakeholders to gather and define data requirements.
  • Develop and maintain documentation for data engineering processes.
  • Participate in on-call rotations to address critical issues.
  • Provide mentorship and guidance to junior team members.

Requirements

  • 8 years of professional experience in data engineering, data product development, and software product launches.
  • Experience with at least three of the following: Java, Python, Spark, Scala, SQL.
  • 4 years of cloud data/software engineering experience building scalable production data pipelines.
  • Experience with data warehouses like Google BigQuery and workflow orchestration tools like Airflow.
  • Knowledge of relational databases like MySQL, PostgreSQL, and SQL Server.
  • Experience with real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
  • Familiarity with microservices architecture for large-scale data processing applications.
  • Experience with REST APIs and DevOps tools such as Tekton, GitHub Actions, and Docker.

Nice-to-haves

  • Automotive industry experience is preferred.
  • Experience working in an onshore/offshore delivery model is preferred.

Benefits

  • Health insurance
  • 401k plan
  • Paid holidays
  • Flexible scheduling
  • Professional development opportunities