This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

Centene Corporation posted 3 months ago
$85,300 - $158,100/Yr
Full-time • Mid Level
MI

About the position

You could be the one who changes everything for our 28 million members by using technology to improve health outcomes around the world. As a diversified, national organization, Centene's technology professionals have access to competitive benefits, including a fresh perspective on workplace flexibility.

Position Purpose: Develops and operationalizes data pipelines to make data available for consumption (reports and advanced analytics), including data ingestion, data transformation, data validation and quality, data pipeline optimization, and orchestration. Engages with the DevSecOps Engineer during continuous integration and continuous deployment. Helps guide the design and implementation of complex data management procedures around data staging, ingestion, preparation, provisioning, and destruction (scripts, programs, automation, assisted by automation, etc.). Manages and optimizes data engineering processes focused on streaming data. Utilizes IDMC and Snowflake for data warehousing tasks. Leverages Python and Snowpark for advanced data operations. Provides guidance to data engineers in the design, development, implementation, testing, documentation, and operation of large-scale, high-volume, high-performance data structures for business intelligence analytics. Designs, develops, and maintains real-time processing applications and real-time data pipelines. Ensures the quality of technical solutions as data moves across Centene's environments. Provides senior-level knowledge and insight into the changing data environment and the company's data processing, storage, and utilization requirements, and drives the team toward viable solutions. Develops, constructs, tests, and maintains architectures using advanced programming languages and tools. Drives improvements in data reliability, efficiency, and quality, deploys solutions, and uses data to discover tasks that can be automated. Performs other duties as assigned. Complies with all policies and standards.
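
To illustrate the kind of ingestion, transformation, and validation work described above, here is a minimal Python/Snowpark sketch. It is an illustrative example only: the connection parameters, table names, and column names (e.g. RAW_DB.CLAIMS.RAW_CLAIMS, MEMBER_ID, CLAIM_AMOUNT) are hypothetical placeholders, not details from the posting.

  from snowflake.snowpark import Session
  from snowflake.snowpark.functions import col
  from snowflake.snowpark.types import DecimalType

  # Placeholder connection details -- replace with real account credentials.
  connection_parameters = {
      "account": "<account_identifier>",
      "user": "<user>",
      "password": "<password>",
      "warehouse": "<warehouse>",
  }
  session = Session.builder.configs(connection_parameters).create()

  # Ingest: read a raw (staged) table. Table and column names are hypothetical.
  raw = session.table("RAW_DB.CLAIMS.RAW_CLAIMS")

  # Transform: drop rows missing a key field and normalize the amount column.
  cleaned = (
      raw.filter(col("MEMBER_ID").is_not_null())
         .with_column("CLAIM_AMOUNT", col("CLAIM_AMOUNT").cast(DecimalType(12, 2)))
  )

  # Validate: fail the run if any negative amounts remain after cleaning.
  bad_rows = cleaned.filter(col("CLAIM_AMOUNT") < 0).count()
  if bad_rows > 0:
      raise ValueError(f"Data validation failed: {bad_rows} negative CLAIM_AMOUNT rows")

  # Provision: write the validated data to a curated table for reporting/analytics.
  cleaned.write.mode("overwrite").save_as_table("ANALYTICS_DB.CURATED.CLAIMS")

In practice, a step like this would typically run as part of an orchestrated pipeline (for example, alongside IDMC mappings) rather than ad hoc.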

Responsibilities

  • Develops and operationalizes data pipelines for data consumption.
  • Engages with the DevSecOps Engineer during continuous integration and continuous deployment.
  • Guides the design and implementation of complex data management procedures.
  • Manages and optimizes data engineering processes focused on streaming data.
  • Utilizes IDMC and Snowflake for data warehousing tasks.
  • Leverages Python and Snowpark for advanced data operations.
  • Provides guidance to data engineers in the design, development, implementation, testing, documentation, and operation of data structures.
  • Designs, develops, and maintains real-time processing applications and data pipelines.
  • Ensures quality of technical solutions as data moves across environments.
  • Provides senior-level knowledge and insight into the changing data environment.
  • Develops, constructs, tests, and maintains architectures using advanced programming languages and tools.
  • Drives improvements in data reliability, efficiency, and quality.

Requirements

  • A Bachelor's degree in a quantitative or business field (e.g., statistics, mathematics, engineering, computer science).
  • 4–6 years of related experience, or equivalent experience.
  • Experience with Big Data and Data Processing.
  • Experience diagnosing system issues, engaging in data validation, and providing quality assurance testing.
  • Experience with Data Manipulation and Data Mining.
  • Experience working in a production cloud infrastructure.
  • Proven expertise in IDMC and Snowflake.
  • Knowledge of Python and Snowpark is advantageous.
  • Strong understanding of cloud concepts related to AWS, Snowflake, and the Informatica IDMC platform.

Nice-to-haves

  • Intermediate ability to identify basic problems and procedural irregularities.
  • Intermediate ability to work independently.
  • Intermediate demonstrated analytical skills.
  • Intermediate demonstrated project management skills.
  • Intermediate ability to demonstrate a high level of accuracy under pressure.
  • Intermediate judgment and decision-making skills.

Benefits

  • Competitive pay.
  • Health insurance.
  • 401(k) and stock purchase plans.
  • Tuition reimbursement.
  • Paid time off plus holidays.
  • Flexible approach to work with remote, hybrid, field or office work schedules.

Job Keywords

Hard Skills
  • Big Data Analytics
  • Data Manipulation Language
  • Informatica
  • Python
  • Snowflake