Anywhere Real Estate - San Francisco, CA

posted about 2 months ago

Full-time - Senior
San Francisco, CA
10,001+ employees
Real Estate

About the position

The Lead Engineer, Data Platform role at Anywhere Real Estate leads the development of a next-generation data platform that supports the company's digital transformation efforts. The position requires a strong background in data engineering, with a focus on building tools for data ingestion, monitoring, and observability. The ideal candidate will collaborate with the team to create innovative data solutions that improve the efficiency and effectiveness of the company's real estate operations.

Responsibilities

  • Work with other Data Engineers on the build-out of the Next Generation Data Platform.
  • Design and develop a Data Ingestion Service for real-time streaming of data from SQL Server, MySQL, and Oracle using CDC-based technologies (see the connector sketch after this list).
  • Design and develop a Data Ingestion Service for real-time streaming of data from third-party APIs, internal microservices, and files stored in S3 or on SFTP servers.
  • Work with the team to design and develop a Data Platform Storage Optimization & Delta Detection Service using Apache Iceberg (see the snapshot-based sketch after this list).
  • Work with the team to design and develop a Data Catalog Service using Snowflake Horizon and Polaris.
  • Work with the team to design and develop Data Observability using DataDog and Data Recon to detect data anomalies.
  • Design and develop a CI/CD process for continuous delivery in AWS Cloud and Snowflake.
  • Design, develop, and test robust, scalable data platform components.
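
For context on the CDC responsibilities above, Debezium connectors are typically registered through the Kafka Connect REST API. The sketch below is a minimal Python illustration, not a description of Anywhere's actual setup: the connector name, hostnames, credentials, and table list are hypothetical placeholders.

    import json
    import urllib.request

    # Hypothetical Debezium SQL Server source connector configuration.
    # Hostnames, credentials, database, and table names are placeholders.
    connector = {
        "name": "listings-sqlserver-cdc",
        "config": {
            "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
            "database.hostname": "sqlserver.internal.example.com",
            "database.port": "1433",
            "database.user": "cdc_reader",
            "database.password": "********",
            "database.names": "listings",
            "topic.prefix": "realestate",
            "table.include.list": "dbo.properties",
            "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
            "schema.history.internal.kafka.topic": "schema-history.listings",
        },
    }

    # Register the connector with the Kafka Connect REST API (port 8083 by default).
    req = urllib.request.Request(
        "http://connect.internal.example.com:8083/connectors",
        data=json.dumps(connector).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())

Once registered, the connector streams row-level change events from the SQL Server transaction log into Kafka topics, which downstream services can consume in real time.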
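Similarly, delta detection on Apache Iceberg tables typically leans on Iceberg's snapshot metadata: every commit produces a snapshot, and an incremental read between two snapshot IDs returns only the rows appended in between. A minimal PySpark sketch, assuming a Hadoop-style catalog; the catalog name, warehouse path, table identifier, and snapshot IDs are placeholders.

    from pyspark.sql import SparkSession

    # Hypothetical Iceberg catalog setup; catalog name and warehouse
    # location are placeholders.
    spark = (
        SparkSession.builder.appName("delta-detection-sketch")
        .config("spark.sql.extensions",
                "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
        .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.lake.type", "hadoop")
        .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse")
        .getOrCreate()
    )

    # Incremental read: rows appended between two snapshots of the table.
    # Snapshot IDs here are placeholders; real IDs come from the table's
    # snapshots metadata table (lake.db.properties.snapshots).
    changes = (
        spark.read.format("iceberg")
        .option("start-snapshot-id", "1111111111111111111")
        .option("end-snapshot-id", "2222222222222222222")
        .load("lake.db.properties")
    )
    changes.show()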

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related technical discipline, or an equivalent combination of training and experience.
  • 10+ years' programming experience building application frameworks and back-end systems for high-volume pipelines using Java/Python.
  • 10+ years' experience building data frameworks and platforms and scaling them to large volumes of data.
  • 5+ years' experience building streaming platforms using Apache Kafka, Confluent Kafka, or Amazon Managed Streaming for Apache Kafka (MSK).
  • 5+ years' experience ingesting data from SQL Server/MySQL/Oracle using Change Data Capture, Debezium, and Kafka Connect.
  • 5 years' experience using AWS Data Services: DMS, EMR, Glue, Athena, S3, and Lambda.
  • 2 years' experience building Data Observability solutions using Monte Carlo or DataDog.
  • 2 years' experience building data solutions using Apache Iceberg or Apache Hudi.
  • 1 year's experience with data architecture, ETL, and processing of structured and unstructured data.
  • 5 years' experience with DevOps tools (any combination of GitHub, Travis CI, or Jira) and methodologies (Lean, Agile, Scrum, Test-Driven Development).

Nice-to-haves

  • Strong analytical skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Proactive and results-driven approach to problem-solving.
  • Ability to thrive in a fast-paced environment.

Benefits

  • Diversity and inclusion initiatives
  • Professional development opportunities
  • Recognition as a Great Place to Work
  • Ethical company recognition
  • Innovative company culture