Senior Data Engineer
Zelis - Boston, MA

posted about 2 months ago

Full-time - Mid Level
Remote - Boston, MA

About the position

As a Senior Data Engineer at Zelis, you will play a pivotal role in building high-level technical designs for both streaming and batch processing systems. Your primary focus will be on designing and constructing reusable components, frameworks, and libraries at scale to support our analytics data products. You will be responsible for performing proof of concepts (POCs) on new technologies and architectural patterns, as well as designing and implementing product features in collaboration with both business and technology stakeholders.

In this role, you will anticipate, identify, and resolve issues related to data management to enhance data quality. You will clean, prepare, and optimize data at scale for ingestion and consumption, driving the implementation of new data management projects and restructuring the current data architecture. Your expertise will be crucial in implementing complex automated workflows and routines using workflow scheduling tools, as well as building continuous integration, test-driven development, and production deployment frameworks. You will also lead collaborative reviews of design, code, test plans, and dataset implementations performed by other data engineers to maintain high data engineering standards.

Analyzing and profiling data to design scalable solutions will be part of your responsibilities, along with troubleshooting complex data issues and performing root cause analysis to proactively resolve product and operational challenges. Additionally, you will mentor and develop offshore data engineers, ensuring they adopt best practices and deliver high-quality data products. Partnering closely with product management, you will understand business requirements, break down epics, and collaborate with engineering managers to define technology roadmaps and align on design, architecture, and enterprise strategy.

Responsibilities

  • Build high-level technical design for streaming and batch processing systems.
  • Design and build reusable components, frameworks, and libraries at scale to support analytics data products.
  • Perform POCs on new technology and architecture patterns.
  • Design and implement product features in collaboration with business and technology stakeholders.
  • Anticipate, identify, and solve issues concerning data management to improve data quality.
  • Clean, prepare, and optimize data at scale for ingestion and consumption.
  • Drive the implementation of new data management projects and restructure the current data architecture.
  • Implement complex automated workflows and routines using workflow scheduling tools (see the sketch after this list).
  • Build continuous integration, test-driven development, and production deployment frameworks.
  • Drive collaborative reviews of design, code, test plans, and dataset implementation performed by other data engineers.
  • Analyze and profile data for the purpose of designing scalable solutions.
  • Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues.
  • Lead, mentor, and develop offshore data engineers in adopting best practices and delivering data products.
  • Partner closely with product management to understand business requirements and break down epics.
  • Partner with engineering managers to define technology roadmaps and align on design, architecture, and enterprise strategy.
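
The posting does not name a specific scheduler, so purely as an illustration of the workflow bullet above, here is a minimal sketch of a daily batch pipeline in Apache Airflow, one widely used workflow scheduling tool. The DAG id, task names, and task bodies are hypothetical placeholders, not details from the posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_records(**context):
    # Hypothetical extract step: pull one day's records from a source system.
    print("extracting records for", context["ds"])


def load_to_warehouse(**context):
    # Hypothetical load step: stage the extract into the warehouse.
    print("loading records for", context["ds"])


with DAG(
    dag_id="daily_batch_pipeline",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # run once per day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_records)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load                     # load runs only after extract succeeds
```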

Requirements

  • Minimum of 6 years of experience with Snowflake (a columnar MPP cloud data warehouse).
  • Experience with ETL tools such as dbt, Informatica, Matillion, Talend, or Azure Data Factory.
  • Proficiency in Python (see the sketch after this list).
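
For concreteness on the Python-plus-Snowflake combination above, here is a minimal sketch using the snowflake-connector-python package. Every connection parameter is a hypothetical placeholder, not a value from the posting; in practice they would come from a secrets manager or environment variables.

```python
import snowflake.connector

# All credentials below are hypothetical placeholders; real values would be
# read from a secrets manager or environment variables, never hard-coded.
conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",    # hypothetical virtual warehouse
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # A trivial smoke-test query; real work would query staged tables.
    cur.execute("SELECT CURRENT_VERSION()")
    print("Snowflake version:", cur.fetchone()[0])
finally:
    conn.close()
```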

Nice-to-haves

  • Experience with T-SQL and SQL Server objects, including stored procedures, triggers, views, functions, and indexes.
  • Knowledge of SQL query optimization.
  • Experience designing and developing Azure Data Factory pipelines (or equivalent AWS data pipeline services).
  • Experience designing and developing data marts in Snowflake.
  • Working knowledge of Azure or AWS cloud architecture, including data lake and data integration services such as Azure Data Lake and Data Factory.
  • Business analysis skills, including analyzing data to inform code and drive solutions.
  • Knowledge of Git, Azure DevOps, Agile, Jira, and Confluence.
  • Working knowledge of Erwin for data modeling.