
The Hartford • posted 21 days ago
$127,200 - $190,800/Yr
Full-time • Manager
Hybrid • Chicago, IL
Insurance Carriers and Related Activities

About the position

The Hartford is seeking a Manager of Data Engineering within Workers' Compensation and Group Benefits Claims Data Science to lead a team of Data Engineers in designing, developing, and implementing modern, sustainable data assets that fuel machine learning and artificial intelligence solutions across a wide range of strategic initiatives designed to drive efficiency within the Workers' Compensation and Group Benefits claims process.

As a Manager of Data Engineering, you will lead and mentor a small team through the entire software development lifecycle in support of continuous data delivery, while growing your knowledge of emerging technologies. We use the latest data technologies, software engineering practices, MLOps, and Agile delivery frameworks, and we are passionate about building well-architected, innovative solutions that drive business value. This cutting-edge, forward-focused organization offers opportunities for collaboration, self-organization within the team, and visibility as we focus on continuous business data delivery.

This role has a hybrid work schedule, with the expectation of working in an office (Columbus, OH; Chicago, IL; Hartford, CT; or Charlotte, NC) three days a week (Tuesday through Thursday).

Responsibilities

  • Lead and mentor data engineers to deliver and maintain reusable and sustainable data assets and production pipelines that assist the functional business units in meeting their strategic objectives
  • Prototype and lead deployment of high-impact innovations, catering to changing business needs, by leveraging new technologies
  • Consult with cross-functional stakeholders in the analysis of short- and long-range business requirements and recommend innovations which anticipate the future impact of changing business needs. Distill these requirements into user stories and action items for team members.
  • Formulate logical statements of business problems and devise, test, and implement efficient, cost-effective application program solutions
  • Identify and validate internal and external data sources for availability and quality. Work with SMEs to describe and understand data lineage and suitability for a use case
  • Create data assets and build data pipelines that align with modern software development principles for further analytical consumption, and perform data analysis to ensure the quality of those assets (see the sketch after this list)
  • Develop code that enables real-time modeling solutions to be ingested into front-end systems
  • Produce code artifacts and documentation in GitHub for reproducible results and hand-off to other data science teams, and ensure high quality standards for direct reports

Requirements

  • 6+ years of relevant experience recommended
  • Bachelor's degree in Computer Science, Engineering, IT, MIS, or a related discipline
  • Experience in managing Data Engineering initiatives in an Agile environment
  • Expertise in Python and SQL
  • Proficiency in ingesting data from a variety of sources and formats, including relational databases, Hadoop/Spark, cloud data sources, XML, and JSON
  • Proficiency in ETL, including metadata management and data validation
  • Expertise in Unix and Git
  • Proficiency in Automation tools (Autosys, Cron, Airflow, etc.)
  • Proficiency with AWS services (e.g., S3, EMR) a plus
  • Proficiency with cloud data warehouses, automation, and data pipelines (e.g., Snowflake, Redshift) a plus
  • Able to communicate effectively with both technical and non-technical teams
  • Able to translate complex technical topics into business solutions and strategies as well as turn business requirements into a technical solution
  • Experience with leading project execution and driving change to core business processes through the innovative use of quantitative techniques
  • Experience building CI/CD pipelines using Jenkins or an equivalent tool
  • Experience with solution design and architecture of data and ML pipelines, as well as integration with enterprise systems
  • Solid understanding of and experience building orchestration frameworks for real-time and batch services (a minimal sketch follows this list)
  • Experience building asynchronous or event-driven services in a cloud environment
  • Familiarity with BI tools (Tableau, Power BI, ThoughtSpot, etc.)
  • Candidate must be authorized to work in the US without company sponsorship.

Benefits

  • Short-term or annual bonuses
  • Long-term incentives
  • On-the-spot recognition

Job Keywords

Hard Skills
  • Cron
  • Git
  • GitHub
  • Jenkins
  • JSON