Sr. Data Engineer - REMOTE

$155,000 - $195,000/Yr

First American Financial - Santa Ana, CA

posted 2 months ago

Full-time - Mid Level
Remote - Santa Ana, CA
Insurance Carriers and Related Activities

About the position

First American is seeking a Senior Data Engineer to enhance our data engineering capabilities within the Strategic Product Development Organization. This role is pivotal in helping the Data Engineering group make real-time decisions based on extensive datasets, facilitating analytics on existing products, and providing insights into potential new growth opportunities. As a Senior Data Engineer, you will collaborate with subject matter experts to address business challenges through data solutions. Additionally, you will mentor junior data engineers, offering guidance and support across various aspects of data engineering.

In this position, you will be responsible for creating scalable, maintainable, and reliable data practices and pipelines that can process large volumes of both structured and unstructured data from the ground up. You will identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery, and redesigning infrastructure for enhanced scalability. A significant part of your role will involve building and enhancing a shared data lake that supports decision-making and model building.

You will work closely with teams across the organization to understand their data needs and develop comprehensive end-to-end data solutions. Collaboration with analysts and data scientists will be essential for performing exploratory analysis and troubleshooting any issues that arise. Furthermore, you will manage and model data using visualization tools to create a collaborative data analytics platform, ensuring that the right data is accessible to the appropriate stakeholders.

Responsibilities

  • Create scalable, maintainable, and reliable data practices and data pipelines that process very large quantities of structured and unstructured data from scratch.
  • Identify, design, and implement internal process improvements, such as automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
  • Build and enhance a shared data lake that powers decision-making and model building.
  • Partner with teams across the business to understand their needs and develop end-to-end data solutions.
  • Collaborate with analysts and data scientists to perform exploratory analysis and troubleshoot issues.
  • Manage and model data using visualization tools to provide the company with a collaborative data analytics platform.
  • Build tools and processes to help make the correct data accessible to the right people.

Requirements

  • 5+ years of development experience with any of the following programming languages: Python, Scala, or SQL.
  • Proven professional experience with event streaming platforms and data pipeline orchestration tools such as Apache Kafka, Fivetran, Apache Airflow, or similar.
  • Proven professional experience with any of the following: Databricks, Snowflake, BigQuery, Spark (any distribution), Hive, Hadoop, Cloudera, or Redshift.
  • Experience with schema design and data ingestion/processing preferred.
  • Experience developing in a containerized local environment using tools such as Docker, Rancher, or Kubernetes preferred.
  • Experience in orchestrating data processing jobs using Apache Airflow preferred.
  • Bachelor's degree in Computer Science (or related field) or equivalent combination of education and experience.

Nice-to-haves

  • Experience with data visualization tools.
  • Familiarity with machine learning concepts and frameworks.

Benefits

  • Medical insurance
  • Dental insurance
  • Vision insurance
  • 401k plan
  • Paid time off (PTO)
  • Paid sick leave
  • Employee stock purchase plan