Cloud Data Engineer

$61,000 - $103,000/Yr

Ford - Bismarck, ND

posted 2 months ago

Full-time - Mid Level
Bismarck, ND
Transportation Equipment Manufacturing

About the position

At Ford Motor Company, we are dedicated to shaping the future of mobility and enhancing the customer experience through advanced technology. As a Cloud Data Engineer, you will play a pivotal role in the Materials Management Platform (MMP), a multi-year transformation initiative aimed at revolutionizing Ford's Material Requirements Planning and Inventory Management capabilities. This position involves designing and deploying a data-centric architecture in Google Cloud Platform (GCP) that integrates data from applications both modern and legacy across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

Your responsibilities will include designing and implementing data-centric solutions using a variety of GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, and more. You will build ETL pipelines to ingest data from diverse sources, develop data processing pipelines using programming languages such as Java and Python, and create and maintain data models for efficient storage and retrieval of large datasets. Additionally, you will deploy and manage both SQL and NoSQL databases, collaborate with cross-functional teams to understand data requirements, and implement security measures to ensure data integrity and confidentiality.

As part of your role, you will optimize data workflows for performance and cost-effectiveness, work with data scientists to integrate machine learning models, troubleshoot data processing issues, and stay current with industry best practices. You will also document data engineering processes, use GCP monitoring tools, and mentor junior team members. This position requires a strong background in GCP data projects, preferably with automotive experience, and a commitment to continuous improvement in data engineering processes.

Responsibilities

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools.
  • Build ETL pipelines to ingest data from heterogeneous sources into our system.
  • Develop data processing pipelines using programming languages like Java and Python.
  • Create and maintain data models for efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage databases, both SQL and NoSQL, based on project requirements.
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions.
  • Implement security measures and data governance policies to ensure data integrity and confidentiality.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure.
  • Work with data scientists and analysts to integrate machine learning models into data pipelines.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Stay updated on industry best practices and emerging technologies within the GCP ecosystem.
  • Collaborate with stakeholders to gather and define data requirements.
  • Design and implement scalable, fault-tolerant solutions for data ingestion, processing, and storage.
  • Develop and maintain documentation for data engineering processes.
  • Utilize GCP monitoring and logging tools to identify and address performance bottlenecks.
  • Implement version control and CI/CD practices for data engineering workflows.
  • Collaborate with DevOps teams to automate infrastructure provisioning and management tasks.
  • Optimize and tune database queries and indexing strategies for improved performance.
  • Provide mentorship and guidance to junior team members.
  • Participate in on-call rotations to address critical issues.

Requirements

  • Bachelor's degree or equivalent work experience (minimum of 12 years), or an associate degree with a minimum of 6 years of equivalent work experience.
  • At least 5 years of experience leading or implementing GCP data projects, preferably including implementation of a complete data-centric model.
  • Automotive experience is preferred.
  • Experience supporting an onshore/offshore delivery model is preferred.
  • Excellent problem-solving skills.
  • Knowledge of and practical experience with agile delivery.

Nice-to-haves

  • Experience with IDoc processing, APIs, and SAP data migration projects.
  • Experience working in an SAP S/4HANA environment.

Benefits

  • Immediate medical, dental, and prescription drug coverage.
  • Flexible family care, parental leave, new parent ramp-up programs, subsidized back-up child care.
  • Vehicle discount program for employees and family members.
  • Tuition assistance.
  • Established and active employee resource groups.
  • Paid time off for individual and team community service.
  • A generous schedule of paid holidays, including the week between Christmas and New Year's Day.
  • Paid time off and the option to purchase additional vacation time.