Unite Here Health - Aurora, IL

posted 2 months ago

Full-time - Mid Level
Remote - Aurora, IL
Personal and Laundry Services

About the position

The Senior Data Engineer II will work with the Analytics and Data Engineering team to build a scalable data delivery platform that drives the use and adoption of distributed data technologies in a cloud environment. The Senior Data Engineer will help deploy and automate distributed technologies, manage upgrade paths, and monitor production implementations of the healthcare data platform. The position is responsible for defining, developing, and maintaining the modernization of the data architecture, ETL processes, and data platform and solutions. There is an opportunity to exercise freedom in building the foundation of this work and to be creative. Our Data Engineer will be someone who likes to build solutions from the ground up and loves working with a modern data warehousing technology stack.

The role involves driving the research, design, and development of a secure, resilient, and self-healing data architecture foundation, including the data warehouse/mart, data integration pipelines, and data-specific software components. The Senior Data Engineer will work alongside the Principal Data Architect to define and implement best practices, standards, and processes for the development, analysis, testing, and tuning of big data solutions. This position leads the creation, maintenance, and optimization of ELT data pipelines from development to production, ensuring adherence to data integration and data quality standards across all development initiatives. In addition, the Senior Data Engineer will execute and oversee the analysis and remediation of root causes, including technological, procedural, or resource capability deficiencies.

Operating in an agile model, the engineer will collaborate with architects, data engineers, data scientists, data analysts, business partners, and other developers in the delivery of data solutions. The role also includes providing technical guidance and mentorship to less experienced Data Engineers, fostering a culture of education and skill development. The engineer will introduce new technologies to solve business problems and help the team understand them, creating relevant prototypes where appropriate. The position requires designing, building, and maintaining modern cloud-based data platforms using technologies such as AWS, GCP, or Azure, while ensuring the ethical use, safety, and privacy of UHH and customer/patient data.

Responsibilities

  • Drives research, design, and development of a secure, resilient, and self-healing data architecture foundation, including data warehouse/mart, data integration pipelines, and data-specific software components.
  • Works alongside the Principal Data Architect to define and implement best practices, standards, and processes for development, analysis, testing, and tuning of big data solutions.
  • Leads the creation, maintenance, and optimization of ELT data pipelines from development to production.
  • Develops and enforces data integration and data quality standards across all development initiatives, adhering to the organization's information services policies and best practices.
  • Executes and oversees the analysis and remediation of root causes, including technological, procedural, or resource capability deficiencies.
  • Operates in an agile model alongside architects, data engineers, data scientists, data analysts, business partners, and other developers in the delivery of data solutions.
  • Provides technical guidance and mentorship to less experienced Data Engineers, fostering a culture of education and skill development.
  • Introduces new technologies to solve business problems and helps the team understand them, creating relevant prototypes where appropriate.
  • Designs, builds, and maintains modern cloud-based data platforms using technologies such as AWS, GCP, or Azure.
  • Ensures the ethical use, safety, and privacy of UHH and customer/patient data.
  • Designs, builds, and maintains scalable and efficient data pipelines and architectures.
  • Aligns systems with business goals and industry standard methodologies, ensuring data integrity and accessibility.
  • Maintains and enhances Data Warehouse ETL, data management, data quality, and analytics processes.
  • Interfaces with users and management regarding requirements, testing, and implementation.

Requirements

  • 5 - 7 years of direct data engineering experience, ideally with a healthcare benefits management organization.
  • Must have experience doing development work involving medical claims, pharmacy claims, and eligibility data, as well as a conceptual understanding of healthcare benefit administration.
  • Demonstrated experience with a variety of relational and NoSQL technologies (e.g., Azure SQL Server, PostgreSQL, Cosmos DB).
  • Experience in a cloud platform (preferably Azure) and its related technical stack (e.g., Azure Data Factory, Synapse, dbt, Fivetran, Snowflake, Databricks, Dremio, Airflow, NiFi).
  • Extensive Azure data technology design and implementation experience: ADF, Azure SQL, Azure Databricks, Azure Analysis Services, Data Lakes, and Power BI.
  • Strong technical understanding of data modeling, data mining, master data management, data integration, data architecture, data virtualization, data warehousing, and data quality techniques.
  • Strong knowledge in SQL, modern programming languages (e.g., Python, R), and common data pipeline/data science libraries.
  • Experience with Git repositories, CI/CD (preferably Azure DevOps), and software development tools, including incident tracking, version control, release management, and testing tools.
  • Experience working with data governance and data security - specifically in moving data pipelines into production with appropriate data quality, governance, and security standards, and certification.
  • Adept in agile methodologies and capable of applying DevOps principles and Data Operations practices to data pipelines.
  • Knowledge of CI/CD processes and source control tools such as GitHub, and related development practices.
  • Experience with Snowflake and utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
  • In-depth understanding of Data Warehouse/ODS and ETL concepts and modeling principles, including data warehousing constructs such as OLTP, OLAP, dimensions, facts, and data modeling.
  • Familiarity with healthcare and security regulatory standards (e.g., HIPAA, CCPA).
  • Strong soft skills, including effective communication and stakeholder management.
  • Experience with Healthcare EDI transactions (837, 835, etc.) and/or Lab Data strongly preferred.

Nice-to-haves

  • Experience with healthcare data integration and analytics.
  • Familiarity with machine learning techniques and their application in data engineering.

Benefits

  • Medical
  • Dental
  • Vision
  • Paid Time-Off (PTO)
  • Paid Holidays
  • 401(k)
  • Pension
  • Short- & Long-term Disability
  • Life
  • AD&D
  • Flexible Spending Accounts (healthcare & dependent care)
  • Commuter Transit
  • Tuition Assistance
  • Employee Assistance Program (EAP)