This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

AmeriHealth Caritas Health Plan • Posted about 1 month ago
Mid Level
Hybrid • Newtown Square, PA
Insurance Carriers and Related Activities

About the position

We are seeking an Azure Data Engineer with an extensive background in building data pipelines using Azure Data Factory and Databricks to join our growing Enterprise Data Office. You will design and implement optimal data pipelines and flows from multiple sources; experience with Snowflake and Azure Synapse is a plus. If you are a confident, skilled, and energetic professional who can complement a blended team, we want to meet you.

Responsibilities

  • Build and maintain scalable automated data pipelines.
  • Support critical data pipelines with a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation.
  • Demonstrated experience turning business use cases and requirements into technical solutions.
  • Expert-level understanding of Azure Data Factory, SQL, ADLS, Azure Databricks, PySpark, Azure DevOps, Kafka, and Python is required.
  • Experience in business process mapping for data and analytics solutions.
  • Ability to conduct data profiling, cataloging, and mapping for the technical design and construction of data flows.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
  • Work with stakeholders including the Product, Data, and Design teams to assist with data-related technical issues and support their data needs.
  • Identify and tackle issues concerning data management to improve data quality.
  • Understand and implement best practices in management of data, including master data, reference data, metadata, data quality and lineage.
  • Strong knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
  • Strong team collaboration and experience working with remote teams.

Requirements

  • At least five (5) years of experience in Azure data engineering.
  • At least four (4) years of hands-on experience with Azure Databricks and Azure Data Factory.
  • At least five (5) years of experience programming in Python, PySpark, Scala, shell scripting, T-SQL, etc.
  • Experience as a Data Engineer in an Azure big data environment.
  • Expertise in ETL tools (e.g., SSIS, DataStage).
  • Expertise in implementing data warehousing and data lake solutions.
  • Experience working in an Agile environment as part of a Scrum team.
  • Hands-on experience with the Azure stack, as well as with HVR, Kafka, and Snowflake.
  • Good understanding of the Azure Databricks platform and the ability to build data analytics solutions that meet the required performance and scale.
  • Good understanding of, and experience working with, Delta Lake and Unity Catalog in Azure Databricks.
  • Experience with performance tuning and optimizing long-running jobs.
  • Demonstrated analytical and problem-solving skills, particularly those that apply to a big data environment.
  • Solid understanding of modern data warehouse architectures and data warehousing concepts.
  • Proficiency with a source code control system.
  • Bachelor's degree in technology or a related field, or an equivalent combination of education and relevant work experience.

Nice-to-haves

  • Experience with Snowflake and Azure Synapse.
  • Experience with HVR, SQL Server, and Collibra governance tools.
  • Experience preparing data for Data Science and Machine Learning.
  • Experience preparing data for use in Azure Databricks.
  • Demonstrated experience preparing data and building reusable data pipeline components and frameworks.

Benefits

  • Flexible work solutions, including remote options and hybrid work schedules.
  • Competitive pay.
  • Paid time off including holidays and volunteer events.
  • Health insurance coverage for you and your dependents on Day 1.
  • 401(k).
  • Tuition reimbursement.

Job Keywords

Hard Skills
  • Azure Data Factory
  • Azure Pipelines
  • Databricks
  • Python
  • SQL
© 2024 Teal Labs, Inc