Ford - Dearborn, MI

posted 3 months ago

Full-time
Dearborn, MI
Transportation Equipment Manufacturing

About the position

We are looking for a Python Cloud Application Development Engineer for the Google Cloud Platform (GCP). In this role, you will work closely with the Advanced Driver Assist Systems (ADAS) engineering team to understand their data analytics needs and develop custom cloud-based Python big data applications. Using the rich Python ecosystem, you will create geo-analysis products from data collected from Connected Vehicles, and analyze diagnostic trouble codes (DTCs) and diagnostic identifiers (DIDs) associated with regulatory feature function and performance (e.g., the rear view camera and other applicable ADAS features). You will develop custom Python web applications on Cloud Run (e.g., Dash, Streamlit, PySpark, and Flask) to process the very large datasets flowing from connected vehicles into BigQuery (BQ) and Google Cloud Storage (GCS), and present these tools to Driver Assistance Engineers. Additionally, you will develop automated data processing pipelines using Terraform, Docker, Tekton, GitHub, and Kubernetes (OpenShift), build and monitor dashboards in tools such as Looker, Looker Studio, Apache Superset, and Qlik Sense, and collaborate with the ADAS feature, systems, and diagnostics teams.
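To give a flavor of the geo-analysis work described above, here is a minimal, self-contained sketch of a great-circle (haversine) distance helper, the kind of primitive a connected-vehicle pipeline might use to relate GPS fixes to points of interest. The function name and the 6371 km mean earth radius are illustrative assumptions, not anything specified by this posting.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean earth radius; an assumption for this sketch

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# One degree of longitude at the equator is roughly 111 km
print(round(haversine_km(0.0, 0.0, 0.0, 1.0), 2))
```

In a real pipeline this kind of helper would typically be vectorized (e.g., with NumPy or PySpark) rather than called point-by-point, but the math is the same.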

Responsibilities

  • Work with the Advanced Driver Assist Systems (ADAS) engineering team to understand their data analytics needs.
  • Develop custom Python web applications on Cloud Run (e.g., Dash, Streamlit, PySpark, and Flask) to process large datasets from connected vehicles.
  • Create geo-analysis products based on data collected from Connected Vehicles.
  • Analyze diagnostic trouble codes (DTCs) and diagnostic identifiers (DIDs) related to regulatory features.
  • Develop automated data processing pipelines using Terraform, Docker, Tekton, GitHub, and Kubernetes (OpenShift).
  • Build and monitor dashboards using tools such as Looker, Looker Studio, Apache Superset, and Qlik Sense.
  • Collaborate with ADAS feature, systems, and diagnostics teams.
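The DTC analysis mentioned above starts from the standard five-character OBD-II code format (e.g., "P0301"). Below is a minimal sketch of a decoder for that conventional format; the function and field names are illustrative assumptions, not Ford internals.

```python
# Standard OBD-II system letters (first character of a DTC)
SYSTEMS = {"P": "Powertrain", "C": "Chassis", "B": "Body", "U": "Network"}

def decode_dtc(code: str) -> dict:
    """Split a five-character diagnostic trouble code into its fields."""
    code = code.strip().upper()
    if len(code) != 5 or code[0] not in SYSTEMS or not code[1:].isalnum():
        raise ValueError(f"not a 5-character DTC: {code!r}")
    return {
        "system": SYSTEMS[code[0]],
        # second character: 0 = generic (SAE-defined), 1 = manufacturer-specific
        "generic": code[1] == "0",
        "fault": code[2:],  # subsystem digit plus specific fault number
    }

print(decode_dtc("P0301"))
```

A fleet-scale version of this would run as a column transform in BigQuery or PySpark rather than per-string Python, but the decoding logic is the same.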

Requirements

  • BS Degree in Engineering, Physics, Computer Science, Data Science, or Control Systems.
  • Familiarity with ADAS feature functional interfaces, diagnostics, and regulatory requirements (e.g., Rear View Camera, Automatic Emergency Braking).
  • Strong expertise in creating web applications using Python frameworks such as Dash, Flask, Streamlit, and PySpark.
  • Experience with high-performance visualization packages like Datashader.
  • Strong expertise in Cloud Platforms such as GCP (Preferred), Azure, AWS, or Databricks.
  • Experience writing SQL queries to process cloud-native big data.
  • Experience in CI/CD pipelines and tools like GitHub, Docker, Jenkins, Tekton, and Terraform.
  • GCP (preferred), AWS, or Azure certification.
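The SQL skills called for above boil down to aggregation queries over vehicle telemetry (in practice against BigQuery). As a runnable stand-in, the sketch below uses Python's built-in sqlite3 to rank DTCs by how often, and on how many vehicles, they occur; the table name, columns, and sample rows are invented for illustration.

```python
import sqlite3

# In-memory stand-in for a cloud warehouse table of DTC events
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dtc_events (vin TEXT, dtc TEXT, recorded_at TEXT)")
conn.executemany(
    "INSERT INTO dtc_events VALUES (?, ?, ?)",
    [
        ("VIN001", "P0301", "2024-01-05"),
        ("VIN001", "P0301", "2024-01-06"),
        ("VIN002", "C1234", "2024-01-06"),
    ],
)

# Rank codes by total occurrences and by distinct vehicles affected
top = conn.execute(
    """
    SELECT dtc, COUNT(*) AS occurrences, COUNT(DISTINCT vin) AS vehicles
    FROM dtc_events
    GROUP BY dtc
    ORDER BY occurrences DESC
    """
).fetchall()
print(top)  # [('P0301', 2, 1), ('C1234', 1, 1)]
```

The same GROUP BY shape carries over to BigQuery SQL essentially unchanged; only the client library and table scale differ.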

Nice-to-haves

  • Experience in JavaScript for high-performance Big Data Geo Visualization (e.g., Deck.gl, Leaflet).
  • MS Degree in Engineering, Physics, Computer Science, Data Science, or Control Systems.
  • Experience with workflow automation tools such as Astronomer, Airflow, and dbt.
  • Experience with ADAS feature functional interfaces and diagnostics.
  • Experience creating RAG/LLM-based tools for natural language queries of large datasets.

Benefits

  • Immediate medical, dental, and prescription drug coverage
  • Flexible family care, parental leave, new parent ramp-up programs, subsidized back-up child care
  • Vehicle discount program for employees and family members, and management leases
  • Tuition assistance
  • Established and active employee resource groups
  • Paid time off for individual and team community service
  • A generous schedule of paid holidays, including the week between Christmas and New Year's Day
  • Paid time off and the option to purchase additional vacation time