V2Soft - Allen Park, MI
posted about 2 months ago
V2Soft is seeking a skilled Data Engineer to join our team in Allen Park, Michigan. The ideal candidate will design and implement data-centric solutions on Google Cloud Platform (GCP), using tools such as BigQuery, Google Cloud Storage, Cloud SQL, and related services to build efficient data workflows.

The Data Engineer will build ETL pipelines that ingest data from diverse sources and ensure it is transformed and loaded effectively into our systems, and will develop data processing pipelines in languages such as Java and Python. The role requires creating and maintaining data models that support efficient storage, retrieval, and analysis of large datasets. The successful candidate will also deploy and manage both SQL and NoSQL databases, optimizing data workflows on the GCP infrastructure for performance, reliability, and cost-effectiveness.

The Data Engineer will implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments, and will use monitoring and logging tools to proactively identify and address performance bottlenecks and system failures. The role also involves troubleshooting and resolving issues related to data processing, storage, and retrieval while maintaining high code quality throughout the development lifecycle.

Collaboration with stakeholders is key: the Data Engineer will gather and define data requirements to ensure alignment with business objectives, and will document data engineering processes to support knowledge transfer and ease of system maintenance. The role also includes participating in on-call rotations to address critical issues and mentoring junior team members, fostering a collaborative environment.
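For candidates unfamiliar with the toolchain, the sketch below illustrates the kind of ETL pipeline work described above: loading a CSV file from Google Cloud Storage into BigQuery with the Python google-cloud-bigquery client. It is a minimal, illustrative example only; the project, dataset, table, and bucket names are hypothetical placeholders and do not reflect V2Soft's actual environment or pipelines.

```python
# Illustrative sketch: load a CSV file from Google Cloud Storage into BigQuery.
# All project, dataset, table, and bucket names are hypothetical.
from google.cloud import bigquery


def load_csv_from_gcs(project_id: str, dataset_id: str, table_id: str, gcs_uri: str) -> None:
    """Run a BigQuery load job that ingests a CSV file from GCS."""
    client = bigquery.Client(project=project_id)
    table_ref = f"{project_id}.{dataset_id}.{table_id}"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the data
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(gcs_uri, table_ref, job_config=job_config)
    load_job.result()  # block until the load job completes

    table = client.get_table(table_ref)
    print(f"Loaded {table.num_rows} rows into {table_ref}")


if __name__ == "__main__":
    # Hypothetical identifiers for illustration only.
    load_csv_from_gcs(
        project_id="example-project",
        dataset_id="analytics",
        table_id="orders",
        gcs_uri="gs://example-bucket/orders/2024-01-01.csv",
    )
```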