Centene - Jefferson City, MO

posted 19 days ago

Full-time - Mid Level
Jefferson City, MO
Ambulatory Health Care Services

About the position

This position supports Centene's Threat Infrastructure Security tools, with an emphasis on data engineering and data science for a Security Data Lake and related automation projects. The role involves developing strategy, designing efficient data pipelines, managing data storage, and ensuring data quality. Responsibilities also include ETL development, code review, operational support, data analysis, model building, and collaboration with cross-functional teams to deliver data-driven insights.

Responsibilities

  • Develop the strategy and design for data pipelines into the Security Data Lake.
  • Store and manage ingested data, optimizing data schemas and ensuring data quality.
  • Design ETL pipelines to transform raw data for analysis, including data cleansing and enrichment.
  • Review code submissions from team members and mentor junior developers on best practices.
  • Perform operational work including troubleshooting and participate in on-call rotation.
  • Acquire and ensure the accuracy and completeness of data from various sources.
  • Analyze large datasets using statistical and machine learning techniques to identify trends and insights.
  • Develop predictive models using machine learning algorithms to solve business problems.
  • Create visualizations to present data findings to stakeholders.
  • Collaborate with cross-functional teams to understand their needs and deliver insights.
  • Design and conduct A/B tests to validate hypotheses or test strategies.
  • Provide strategic advice on leveraging data for business growth and decision-making.
  • Streamline data processes and automate repetitive tasks for efficiency.
  • Stay updated with trends in software architecture and AI technologies.
  • Troubleshoot and solve complex technical problems as they arise.
  • Participate in or conduct training sessions to enhance team skills.

Requirements

  • Bachelor's degree in a quantitative or business field (e.g., statistics, mathematics, engineering, computer science).
  • 5–7 years of related experience, or equivalent.
  • Proficiency with the Databricks platform.
  • Advanced data pipeline design and development skills.
  • Knowledge of data quality and governance.
  • Experience in machine learning model development and maintenance.
  • Familiarity with data integration processes and data security regulations.
  • Proficiency in SQL and Python.
  • Knowledge of Big Data technologies (Hadoop, Spark).
  • Experience with cloud computing (AWS, Azure, GCP).
  • Strong problem-solving skills and excellent communication skills.

Nice-to-haves

  • CompTIA Security+, CISSP, or Splunk certifications preferred.
  • Databricks or Python certifications preferred.
  • Professional Data Engineer (Google Cloud) or AWS Certified Big Data preferred.

Benefits

  • Competitive pay
  • Health insurance
  • 401(k) and stock purchase plans
  • Tuition reimbursement
  • Paid time off plus holidays
  • Flexible work schedules (remote, hybrid, field, or office)