Zelis - Atlanta, GA

posted 2 months ago

Full-time - Senior
Remote - Atlanta, GA

About the position

The Senior Data Engineer role at Zelis involves designing and implementing scalable data solutions to support analytics data products. This position requires collaboration with business and technology stakeholders to enhance data management and quality, while also mentoring offshore data engineers in best practices. The role is pivotal in driving the implementation of new data management projects and optimizing existing data architecture.

Responsibilities

  • Build high-level technical design for streaming and batch processing systems.
  • Design and build reusable components, frameworks, and libraries at scale to support analytics data products.
  • Perform POCs on new technology and architecture patterns.
  • Design and implement product features in collaboration with business and technology stakeholders.
  • Anticipate, identify, and solve issues concerning data management to improve data quality.
  • Clean, prepare, and optimize data at scale for ingestion and consumption.
  • Drive the implementation of new data management projects and restructure the current data architecture.
  • Implement complex automated workflows and routines using workflow scheduling tools.
  • Build continuous integration, test-driven development, and production deployment frameworks.
  • Drive collaborative reviews of design, code, test plans, and dataset implementation performed by other data engineers.
  • Analyze and profile data for the purpose of designing scalable solutions.
  • Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues.
  • Lead, mentor, and develop offshore data engineers in adopting best practices and delivering data products.
  • Partner closely with product management to understand business requirements and break down epics.
  • Partner with engineering managers to define technology roadmaps and align on design, architecture, and enterprise strategy.

Requirements

  • Minimum of 6 years of experience with Snowflake (columnar MPP cloud data warehouse).
  • Experience with ETL tools such as DBT, Informatica, Matillion, Talend, or Azure Data Factory.
  • Proficiency in Python.

Nice-to-haves

  • Experience with SQL objects (procedures, triggers, views, functions) in SQL Server.
  • Knowledge of SQL query optimizations.
  • Understanding of T-SQL, indexes, stored procedures, triggers, functions, views, etc.
  • Design and development of Azure/AWS Data Factory pipelines.
  • Design and development of data marts in Snowflake.
  • Working knowledge of Azure/AWS Architecture, Data Lake, Data Factory.
  • Business analysis experience to analyze data, write code, and drive solutions.
  • Knowledge of Git, Azure DevOps, Agile, Jira, and Confluence.
  • Working knowledge of Erwin for data modeling.

Benefits

  • Hybrid- and remote-friendly work culture.
  • Commitment to diversity, equity, inclusion, and belonging.
  • Equal employment opportunity policies.
  • Accessibility support for candidates with disabilities.