Anywhere Real Estate - Cleveland, OH

posted about 2 months ago

Full-time - Senior
Cleveland, OH
10,001+ employees
Real Estate

About the position

The Lead Engineer Data Platform role at Anywhere Real Estate involves leading the development of innovative data platform tools that enhance the company's data capabilities. This position is focused on building and optimizing data ingestion services, ensuring data quality, and implementing observability tools to support the company's digital transformation in the real estate industry.

Responsibilities

  • Work with other Data Engineers for the build-out of the Next Generation Data Platform.
  • Design and develop a Data Ingestion Service for real-time streaming of data from SQL Server, MySQL, and Oracle using CDC-based technologies.
  • Design and develop a Data Ingestion Service for real-time streaming of data from third-party APIs, internal microservices, and files stored in S3/SFTP servers.
  • Design and develop a Data Platform Storage Optimization & Delta Detection Service using Apache Iceberg.
  • Design and develop a Data Catalog Service using Snowflake Horizon and Polaris.
  • Design and develop Data Observability using DataDog and Data Recon to detect data anomalies.
  • Design and develop a CI/CD process for continuous delivery in AWS Cloud and Snowflake.
  • Design, develop, and test robust, scalable data platform components.

Requirements

  • Bachelor's in Computer Science, Engineering, or related technical discipline or equivalent combination of training and experience.
  • 10+ years' programming experience building application frameworks and back-end systems for high-volume pipelines using Java/Python.
  • 10+ years' experience building data frameworks and platforms and scaling them to large volumes of data.
  • 5+ years' experience building streaming platforms using Apache Kafka, Confluent Kafka, AWS Managed Kafka Service.
  • 5+ years' experience ingesting data from SQL Server/MySQL/Oracle using Change Data Capture, Debezium, and Kafka Connect.
  • 5 years' experience using AWS Data Services: DMS, EMR, Glue, Athena, S3, and Lambda.
  • 2 years' experience building Data Observability using Monte Carlo or DataDog.
  • 2 years' experience building data solutions using Apache Iceberg, Apache Hudi.
  • 1 year's experience with data architecture, ETL, and processing of structured and unstructured data.
  • 5 years' experience with DevOps tools (any combination of GitHub, Travis CI, or Jira) and methodologies (Lean, Agile, Scrum, Test-Driven Development).

Nice-to-haves

  • Strong analytical skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Proactive and results-driven approach to problem-solving.
  • Ability to thrive in a fast-paced environment.

Benefits

  • Diversity and inclusion initiatives
  • Professional development opportunities
  • Recognition as a Great Place to Work
  • Commitment to ethical practices and diversity in hiring