V2Soft - Allen Park, MI

posted about 2 months ago

Full-time - Mid Level
Allen Park, MI
Professional, Scientific, and Technical Services

About the position

V2Soft is seeking a skilled Data Engineer to join our team in Allen Park, Michigan. The ideal candidate will design and implement data-centric solutions on the Google Cloud Platform (GCP), using tools such as BigQuery, Google Cloud Storage, and Cloud SQL to create efficient data workflows. The Data Engineer will build ETL pipelines to ingest data from diverse sources, ensuring that data is transformed and loaded effectively into our systems.

Beyond ETL, the role involves developing data processing pipelines in languages such as Java and Python, and creating and maintaining data models that support efficient storage, retrieval, and analysis of large datasets. The successful candidate will deploy and manage both SQL and NoSQL databases and optimize data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure. The Data Engineer will implement version control and CI/CD practices for data engineering workflows to ensure reliable, efficient deployments, and will use monitoring and logging tools to proactively identify and address performance bottlenecks and system failures. The role also involves troubleshooting and resolving issues related to data processing, storage, and retrieval while maintaining high code quality throughout the development lifecycle.

Collaboration with stakeholders is key: the Data Engineer will gather and define data requirements to ensure alignment with business objectives, and will document data engineering processes for knowledge transfer and ease of system maintenance. The Data Engineer will also participate in on-call rotations to address critical issues and provide mentorship to junior team members, fostering a collaborative environment.

Responsibilities

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools.
  • Build ETL pipelines to ingest data from heterogeneous sources into our system.
  • Develop data processing pipelines using programming languages like Java and Python.
  • Create and maintain data models for efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage databases, both SQL and NoSQL, based on project requirements.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure.
  • Implement version control and CI/CD practices for data engineering workflows.
  • Utilize GCP monitoring and logging tools to identify and address performance bottlenecks.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Address code quality issues using tools like SonarQube and Checkmarx.
  • Implement security measures and data governance policies.
  • Collaborate with stakeholders to gather and define data requirements.
  • Develop and maintain documentation for data engineering processes.
  • Participate in on-call rotations to address critical issues.
  • Provide mentorship and guidance to junior team members.

Requirements

  • 8 years of professional experience in data engineering, data product development, and software product launches.
  • Experience with at least three of the following languages and frameworks: Java, Python, Spark, Scala, SQL, including performance tuning experience.
  • 4 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines.
  • Experience with data warehouses like Google BigQuery and workflow orchestration tools like Airflow.
  • Proficiency in Relational Database Management Systems like MySQL, PostgreSQL, and SQL Server.
  • Experience with real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
  • Familiarity with microservices architecture for large-scale real-time data processing applications.
  • Experience with REST APIs for compute, storage, operations, and security.
  • Knowledge of DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, and Docker.
  • Experience with project management tools like Atlassian JIRA.

Nice-to-haves

  • Automotive industry experience.
  • Experience providing support in an onshore/offshore delivery model.