Geico - Chevy Chase, MD

posted about 2 months ago

Full-time - Mid Level
Chevy Chase, MD
Insurance Carriers and Related Activities

About the position

GEICO is seeking an experienced Senior Engineer with a passion for building high-performance, low-maintenance, zero-downtime platforms and applications. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission, while co-creating a culture of psychological safety and continuous improvement. Our Senior Engineer is a key member of the engineering staff, working across the organization to provide a frictionless experience to our customers and maintain the highest standards of protection and availability. Our team thrives on delivering high-quality technology products and services in a hyper-growth environment where priorities shift quickly.

The ideal candidate has broad and deep technical knowledge, typically ranging from front-end UIs through back-end systems and all points in between. Candidates must have expertise in SQL and strong knowledge of data engineering ETL concepts. They should have experience with at least one programming language (Python, Java, etc.) and be able to support new data development as well as maintain existing pipelines. Preferred experience includes a background in Databricks, dbt, Python, Airflow, Azure Data Factory, and/or Kafka. Team members will work to deliver on customer needs within our enterprise data warehouse, which may include data ingestion, flattening, alerting, testing, transformation, optimization, and/or data mart development.

Responsibilities

  • Design and implement a data ingestion platform
  • Scope, design, and build scalable, resilient distributed systems
  • Build product definition and leverage your technical skills to drive towards the right solution
  • Engage in cross-functional collaboration throughout the entire software lifecycle
  • Lead design sessions and code reviews with peers to elevate the quality of engineering across the organization
  • Define, create, and support reusable application components/patterns from a business and technology perspective
  • Build the processes required for optimal extraction, transformation, and loading of data
  • Work with other teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
  • Perform unit tests and conduct reviews with other team members to make sure code is rigorously designed, elegantly coded, and effectively tuned for performance
  • Mentor other engineers
  • Consistently apply best practices and improve processes within and across teams
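As a rough illustration of the extract-transform-load work the responsibilities above describe, here is a minimal pipeline sketch. All names (`extract`, `transform`, `load`, the policy fields) are hypothetical examples, not GEICO's actual stack or schema:

```python
# Illustrative only: a minimal extract/transform/load sketch of the kind of
# pipeline work described above. Field names are hypothetical.
import json
from datetime import date

def extract(raw_records):
    """Parse raw JSON strings into dicts, skipping malformed rows."""
    rows = []
    for rec in raw_records:
        try:
            rows.append(json.loads(rec))
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter queue
    return rows

def transform(rows):
    """Flatten nested policy records and normalize field names/units."""
    out = []
    for row in rows:
        policy = row.get("policy", {})
        out.append({
            "policy_id": policy.get("id"),
            "state": policy.get("state", "").upper(),
            "premium_cents": int(round(row.get("premium", 0.0) * 100)),
            "load_date": date.today().isoformat(),
        })
    return out

def load(rows, target):
    """Append transformed rows to an in-memory stand-in for a warehouse table."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = ['{"policy": {"id": "P-1", "state": "md"}, "premium": 123.45}', 'not json']
loaded = load(transform(extract(raw)), warehouse)
```

In practice each stage would be a task in an orchestrator such as Airflow, with the malformed-row handling and flattening rules driven by the warehouse's data-quality requirements.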

Requirements

  • Experience developing new and enhancing existing data processing components (Data Ingest, Data Transformation, Data Store, Data Management, Data Quality)
  • Advanced programming experience and big data experience
  • Understanding of data warehouse concepts including data modeling and OLAP
  • Experience working with cloud data solutions (Delta Lake, Iceberg, Hudi, Snowflake, Redshift or equivalent)
  • Experience with data formats such as Parquet, Avro, ORC, XML, JSON
  • Experience with designing, developing, implementing, and maintaining solutions for data ingestion and transformation projects
  • Experience working with streaming applications (Spark Streaming, Flink, Kafka or equivalent)
  • Experience with data processing/transformation using ETL/ELT tools such as dbt (Data Build Tool) or Databricks
  • Experience with programming languages such as Python, Scala, or Java, including Spark APIs
  • Experience with containerization and container orchestration, including Docker and Kubernetes
  • Strong working knowledge of SQL and the ability to write, debug and optimize SQL queries and ETL jobs to reduce the execution window or reduce resource utilization
  • Experience with cloud computing (AWS, Microsoft Azure, Google Cloud)
  • Exposure to messaging such as Kafka, ActiveMQ, RabbitMQ or similar messaging technologies
  • Experience with developing systems that are scalable, resilient, and highly available
  • Experience with Infrastructure as Code
  • Experience with CI/CD deployment and test automation using ADO, Jenkins, Gradle, Artifactory, or equivalents
  • Experience with version control systems such as Git
  • Experience with load testing and load testing tools
  • Advanced understanding of monitoring concepts and tooling
  • Experience with Elasticsearch, Dynatrace, ThousandEyes, InfluxDB, Prometheus, Grafana, or equivalents
  • Experience architecting and designing new and current systems
  • Advanced understanding of DevOps concepts
  • Strong problem-solving ability
  • Ability to excel in a fast-paced environment
  • Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
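The SQL-tuning requirement above (reducing a query's execution window or resource use) can be sketched with Python's built-in sqlite3 module. The table, column, and index names here are hypothetical examples, not GEICO's schema:

```python
# Illustrative only: showing how a covering index changes a query plan,
# the kind of SQL optimization the requirements describe.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, state TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO claims (state, amount) VALUES (?, ?)",
    [("MD", 100.0), ("VA", 250.0), ("MD", 75.0)],
)

# Without an index, filtering on state scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE state = ?", ("MD",)
).fetchall()

# A covering index on (state, amount) lets SQLite answer from the index alone.
conn.execute("CREATE INDEX idx_claims_state_amount ON claims (state, amount)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE state = ?", ("MD",)
).fetchall()

total_md = conn.execute(
    "SELECT SUM(amount) FROM claims WHERE state = ?", ("MD",)
).fetchone()[0]
```

The same reasoning (inspect the plan, then add or adjust indexes so filters and aggregates are served from indexed columns) carries over to warehouse engines, though each engine has its own plan-inspection tooling.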

Nice-to-haves

  • Experience with REST, Microservices is a big plus

Benefits

  • Premier Medical, Dental and Vision Insurance with no waiting period
  • Paid Vacation, Sick and Parental Leave
  • 401(k) Plan
  • Tuition Reimbursement
  • Paid Training and Licensures