AArete - Chicago, IL

posted 3 months ago

Full-time - Mid Level
Chicago, IL
Professional, Scientific, and Technical Services

About the position

The Consultant, Data Architecture & Engineering at AArete plays a pivotal role in leading and executing data architecture and engineering initiatives for a diverse range of clients. This position requires a collaborative approach, working closely with cross-functional teams to design, build, and maintain scalable data solutions that ensure the highest levels of data quality and availability.

The consultant will be responsible for developing advanced data engineering solutions using Snowflake SQL, applying data modeling principles to maintain data integrity, and optimizing data structures. In this role, you will build scalable ETL/ELT pipelines with Apache Airflow and develop clean, efficient Python scripts for data transformation and analysis. You will also collaborate with business stakeholders to translate their requirements into effective technical solutions and document technical processes and solutions comprehensively in Confluence.

Additionally, you will create and maintain data visualization dashboards and reports using Power BI, facilitate collaboration through Microsoft Teams, and manage code versioning with BitBucket Git repositories. Maintaining foundational AWS infrastructure components and using Jira for task tracking and service ticket management are also key responsibilities of this position.

Responsibilities

  • Design, develop, and deploy advanced data engineering solutions using Snowflake SQL.
  • Apply data modeling principles to ensure data integrity and optimize data structures.
  • Build scalable ETL/ELT pipelines with Apache Airflow.
  • Develop clean, efficient Python scripts for data transformation and analysis.
  • Collaborate with business stakeholders to translate their requirements into effective technical solutions.
  • Document technical processes and solutions comprehensively in Confluence.
  • Create and maintain data visualization dashboards and reports using Power BI.
  • Facilitate collaboration through Microsoft Teams and manage code versioning with BitBucket Git repositories.
  • Maintain foundational AWS infrastructure components.
  • Utilize Jira for task tracking and service ticket management.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or related field; or equivalent work experience.
  • 3 to 5 years of experience in the healthcare industry, with a deep understanding of payer industry claims data.
  • Strong knowledge of Snowflake SQL.
  • Experience with Apache Airflow.
  • Proficiency in Python programming.
  • Basic understanding of AWS infrastructure.
  • Strong experience with Data Modeling concepts.
  • Familiarity with technical writing and Confluence.
  • Experience using Git and BitBucket.
  • Proficiency in Power BI for data visualization.
  • Familiarity with Jira for service ticket management.
  • Strong verbal and written communication skills.
  • Ability to work effectively in a collaborative team environment.

Nice-to-haves

  • Based in Chicago, IL and flexible to work from our Chicago office as needed.
  • DevOps & DataOps experience, particularly in CI/CD pipelines.
  • Experience with Pulumi.
  • Experience writing unit tests.
  • Familiarity with Agile ways of working (Scrum, Kanban, XP).
  • Ability to convey technical concepts to non-technical business stakeholders.

Benefits

  • Flexible PTO, monthly half-day refuels, volunteer time off, 10 paid holidays.
  • Own Your Day flexible work policy.
  • Competitive, majority employer-paid benefits: Medical, Dental, Vision, and 401(k) Match.
  • Employee Stock Ownership Plan.
  • Generous maternity/paternity leave options.
  • Fully employer-paid Life Insurance, STD, and LTD.
  • Charitable contribution matching program.
  • New client commission opportunities and referral bonus program.
  • Video-free Fridays.
  • Bike share discount program.