Geico - Chevy Chase, MD

posted about 1 month ago

Full-time - Mid Level
Chevy Chase, MD
Insurance Carriers and Related Activities

About the position

The Senior Engineer - Data ETL (SQL & Spark) at GEICO is a pivotal role focused on building high-performance, low-maintenance platforms and applications that support the company's transformation into a tech-driven organization. The position emphasizes engineering excellence, psychological safety, and continuous improvement, and contributes to delivering high-quality technology products and services in a fast-paced environment. The ideal candidate will bring deep technical knowledge across a range of systems and will design and implement data ingestion platforms, collaborate across teams, and mentor fellow engineers.

Responsibilities

  • Design and implement a data ingestion platform.
  • Scope, design, and build scalable, resilient distributed systems.
  • Help shape product definitions and apply technical expertise to drive toward the right solution.
  • Engage in cross-functional collaboration throughout the entire software lifecycle.
  • Lead design sessions and code reviews with peers to elevate engineering quality.
  • Define, create, and support reusable application components/patterns from a business and technology perspective.
  • Build processes for optimal extraction, transformation, and loading of data.
  • Work with teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies.
  • Perform unit tests and conduct reviews to ensure code quality and performance.
  • Share knowledge of tech trends and mentor other engineering community members.

Requirements

  • 4+ years of professional software development experience in Java, Spark, Scala, or Python.
  • 3+ years of experience with architecture and design.
  • 3+ years of experience with AWS, GCP, Azure, or another cloud service.
  • 2+ years of experience in open-source frameworks.
  • Advanced programming experience and big data experience.
  • Strong working knowledge of SQL and ability to write, debug, and optimize SQL queries and ETL jobs.
  • Experience with cloud data platforms and table formats such as Delta Lake, Iceberg, Hudi, Snowflake, or Redshift.
  • Experience with data formats such as Parquet, Avro, ORC, XML, JSON.
  • Experience with streaming applications (Spark Streaming, Flink, Kafka).
  • Experience with CI/CD deployment and test automation tools like ADO, Jenkins, Gradle.

Nice-to-haves

  • Experience with containerization and orchestration tools, including Docker and Kubernetes.
  • Experience with load testing and load testing tools.
  • Experience with Elasticsearch, Dynatrace, ThousandEyes, InfluxDB, Prometheus, and Grafana.

Benefits

  • Premier Medical, Dental and Vision Insurance with no waiting period.
  • Paid Vacation, Sick and Parental Leave.
  • 401(k) Plan.
  • Tuition Reimbursement.
  • Paid Training and Licensures.