Blue Cross Blue Shield - Meridian, ID

posted 2 months ago

Full-time - Mid Level
Remote - Meridian, ID
Insurance Carriers and Related Activities

About the position

The Data Engineer at Blue Cross of Idaho designs, develops, and maintains robust data pipelines and architectures. The position requires collaboration with cross-functional teams to ensure the scalability, reliability, and efficiency of the data infrastructure, with a strong focus on cloud technologies (particularly AWS) and proficiency in Snowflake and dbt. The ideal candidate will also have experience building event-driven applications and will contribute to the organization's overall data strategy.

Responsibilities

  • Design, develop, and deploy scalable data pipelines and architectures on AWS cloud infrastructure.
  • Implement and optimize data models using Snowflake and dbt for efficient data transformation and analysis (see the dbt sketch following this list).
  • Collaborate with data scientists, analysts, and software engineers to understand data requirements and ensure alignment with business objectives.
  • Build event-driven data processing systems to enable real-time data ingestion, processing, and analytics.
  • Implement ABC (Audit/Balance/Control), monitoring, alerting, and logging solutions to ensure the reliability and performance of data pipelines.
  • Evaluate and implement best practices for data governance, security, and compliance.
  • Mentor team members and provide technical guidance and support as needed.
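
As a rough illustration of the Snowflake/dbt work described above, the sketch below shows a dbt Python model (supported in dbt 1.3+ on Snowflake via Snowpark). The upstream model `stg_claims` and its columns are hypothetical placeholders, not part of this posting.

```python
# models/marts/claims_monthly.py -- a minimal dbt Python model sketch.
# Assumes a Snowflake target with Snowpark enabled; all table and column
# names (stg_claims, member_id, service_month, paid_amount) are invented
# for illustration.
import snowflake.snowpark.functions as F


def model(dbt, session):
    # Materialize the result as a table in the target schema.
    dbt.config(materialized="table")

    # dbt.ref() returns a Snowpark DataFrame for the upstream model.
    claims = dbt.ref("stg_claims")

    # Aggregate total paid amount per member per service month.
    return claims.group_by("member_id", "service_month").agg(
        F.sum("paid_amount").alias("total_paid")
    )
```

An equivalent SQL model would work just as well; the Python form is shown only to keep all examples here in one language.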

Requirements

  • Bachelor's Degree in Computer Science, Electrical Engineering, Information Systems, or a closely related field of study, or equivalent work experience.
  • 2-4+ years of experience in data engineering roles, with a focus on building scalable data pipelines and architectures.
  • Proficiency in cloud technologies, particularly AWS, including services such as S3, EC2, Lambda, Glue, Kinesis, and MSK/Kafka.
  • Expert-level experience with the Snowflake data warehouse platform, including data modeling, performance tuning, and administration.
  • Minimum 3+ years of hands-on experience implementing large enterprise applications with very large data volumes, using dbt for data transformation and orchestration.
  • Solid understanding of event-driven architecture principles and experience building event-driven applications (see the Lambda sketch following this list).
  • Proficiency in programming languages such as Python, Java, or Scala for data processing and scripting.
  • Experience with containerization technologies such as Docker, Amazon ECS, and AWS Fargate.
  • Excellent problem-solving skills and ability to work effectively in a fast-paced, collaborative environment.
  • Strong communication skills with the ability to effectively communicate technical concepts to non-technical partners.
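
For the event-driven requirement above, here is a minimal sketch of an AWS Lambda handler consuming a Kinesis stream. The downstream processing step is hypothetical; the record-decoding logic follows the standard Kinesis-to-Lambda event shape.

```python
import base64
import json


def handler(event, context):
    """Lambda entry point for a Kinesis event source mapping.

    Kinesis delivers each record base64-encoded under
    event["Records"][i]["kinesis"]["data"].
    """
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        process(payload)


def process(payload):
    # Hypothetical downstream step: in a real pipeline this might land
    # the event in S3 for Snowpipe ingestion into Snowflake.
    print(json.dumps(payload))
```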

Nice-to-haves

  • Experience with Amazon MSK (Managed Streaming for Apache Kafka) is highly desirable.

Benefits

  • Health insurance coverage
  • Professional development opportunities
  • Flexible work location options (hybrid or remote)
  • Paid time off for volunteering activities
  • Employee wellness programs