Senior Data Engineer
Zelis - Tallahassee, FL

posted 2 months ago

Full-time - Mid Level
Remote - Tallahassee, FL

About the position

As a Senior Data Engineer at Zelis, you will play a crucial role in building high-level technical designs for both streaming and batch processing systems. Your primary responsibility will be to design and build reusable components, frameworks, and libraries at scale to support our analytics data products. You will also perform proofs of concept (POCs) on new technologies and architecture patterns, ensuring that our data management practices are robust and efficient. Collaboration is key in this role, as you will work closely with business and technology stakeholders to design and implement product features that meet our organizational needs.

In addition to your design responsibilities, you will anticipate, identify, and solve data management issues to improve data quality. This includes cleaning, preparing, and optimizing data at scale for ingestion and consumption. You will drive the implementation of new data management projects and restructure our current data architecture to improve efficiency and effectiveness. Your expertise will be essential in implementing complex automated workflows and routines using workflow scheduling tools, as well as building continuous integration, test-driven development, and production deployment frameworks.

As a leader on the team, you will conduct collaborative reviews of designs, code, test plans, and dataset implementations performed by other data engineers, ensuring adherence to data engineering standards. You will analyze and profile data to design scalable solutions, troubleshoot complex data issues, and perform root cause analysis to proactively resolve product and operational challenges. You will also mentor and develop offshore data engineers, guiding them in adopting best practices and delivering high-quality data products. Finally, you will partner closely with product management to understand business requirements and break down epics, and collaborate with engineering managers to define technology roadmaps and align on design, architecture, and enterprise strategy.

Responsibilities

  • Build high-level technical design for streaming and batch processing systems.
  • Design and build reusable components, frameworks, and libraries at scale to support analytics data products.
  • Perform POCs on new technologies and architecture patterns.
  • Design and implement product features in collaboration with business and technology stakeholders.
  • Anticipate, identify, and solve issues concerning data management to improve data quality.
  • Clean, prepare, and optimize data at scale for ingestion and consumption.
  • Drive the implementation of new data management projects and restructure the current data architecture.
  • Implement complex automated workflows and routines using workflow scheduling tools.
  • Build continuous integration, test-driven development, and production deployment frameworks.
  • Drive collaborative reviews of design, code, test plans, and dataset implementation performed by other data engineers.
  • Analyze and profile data to design scalable solutions.
  • Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues.
  • Lead, mentor, and develop offshore data engineers in adopting best practices and delivering data products.
  • Partner closely with product management to understand business requirements and break down epics.
  • Partner with engineering managers to define technology roadmaps and align on design, architecture, and enterprise strategy.

Requirements

  • Minimum of 6 years of experience with Snowflake (columnar MPP cloud data warehouse).
  • Experience with ETL tools such as dbt, Informatica, Matillion, Talend, or Azure Data Factory.
  • Proficiency in Python.

Nice-to-haves

  • Experience with SQL Server objects in T-SQL: stored procedures, triggers, views, functions, and indexes.
  • Knowledge of SQL query optimization.
  • Experience designing and developing data pipelines in Azure Data Factory or equivalent AWS services.
  • Experience designing and developing data marts in Snowflake.
  • Working knowledge of Azure/AWS architecture, including Data Lake and Data Factory.
  • Business analysis experience: analyzing data to inform code and drive solutions.
  • Knowledge of Git, Azure DevOps, Agile, Jira, and Confluence.
  • Working knowledge of Erwin for data modeling.