This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

The Hartford • Posted 17 days ago
$127,200 - $190,800/Yr
Full-time • Manager
Hybrid • Columbus, OH
Insurance Carriers and Related Activities

About the position

The Hartford is seeking a Manager of Data Engineering within Workers' Compensation and Group Benefits Claims Data Science to lead a team of Data Engineers who design, develop, and implement modern, sustainable data assets that fuel machine learning and artificial intelligence solutions across a wide range of strategic initiatives aimed at driving efficiency within the Workers' Compensation and Group Benefits claims process.

As a Manager of Data Engineering, you will lead and mentor a small team through the entire software development lifecycle in support of continuous data delivery while growing your knowledge of emerging technologies. We use the latest data technologies, software engineering practices, MLOps, and Agile delivery frameworks, and we are passionate about building well-architected and innovative solutions that drive business value. This cutting-edge, forward-focused organization offers opportunities for collaboration, self-organization within the team, and visibility as we focus on continuous business data delivery.

This role has a hybrid work schedule, with the expectation of working in an office (Columbus, OH; Chicago, IL; Hartford, CT; or Charlotte, NC) three days a week (Tuesday through Thursday).

Responsibilities

  • Lead and mentor data engineers to deliver and maintain reusable and sustainable data assets and production pipelines that assist the functional business units in meeting their strategic objectives
  • Prototype and lead deployment of high impact innovations, catering to changing business needs, by leveraging new technologies
  • Consult with cross-functional stakeholders in the analysis of short- and long-range business requirements and recommend innovations that anticipate the future impact of changing business needs. Distill these requirements into user stories and action items for team members.
  • Formulate logical statements of business problems and devise, test, and implement efficient, cost-effective application program solutions
  • Identify and validate internal and external data sources for availability and quality. Work with SMEs to describe and understand data lineage and suitability for a use case
  • Create data assets and build data pipelines that align to modern software development principles for further analytical consumption. Perform data analysis to ensure quality of data assets
  • Develop code that enables real-time modeling solutions to be ingested into front-end systems
  • Produce code artifacts and documentation using GitHub for reproducible results and hand-off to other data science teams, and ensure high-quality standards for direct reports

Requirements

  • 6+ years of relevant experience recommended
  • Bachelor's degree in Computer Science, Engineering, IT, MIS, or a related discipline
  • Experience in managing Data Engineering initiatives in an Agile environment
  • Expertise in Python and SQL
  • Proficiency in ingesting data from a variety of sources and formats, including relational databases, Hadoop/Spark, cloud data sources, XML, and JSON
  • Proficiency in ETL, including metadata management and data validation
  • Expertise in Unix and Git
  • Proficiency in automation tools (Autosys, cron, Airflow, etc.)
  • Proficiency with AWS services (e.g., S3, EMR) a plus
  • Proficiency with cloud data warehouses, automation, and data pipelines (e.g., Snowflake, Redshift) a plus
  • Able to communicate effectively with both technical and non-technical teams
  • Able to translate complex technical topics into business solutions and strategies as well as turn business requirements into a technical solution
  • Experience with leading project execution and driving change to core business processes through the innovative use of quantitative techniques
  • Experience building CI/CD pipelines using Jenkins or an equivalent tool
  • Experience with solution design and architecture of data and ML pipelines, as well as integration with enterprise systems
  • Solid understanding of and experience building orchestration frameworks for real-time and batch services
  • Experience building asynchronous or event-driven services in a cloud environment
  • Familiarity with BI tools (Tableau, Power BI, ThoughtSpot, etc.)

Benefits

  • Short-term or annual bonuses
  • Long-term incentives
  • On-the-spot recognition