This job is closed


Unclassified - Englewood, CO

posted about 2 months ago

Full-time - Principal
Remote - Englewood, CO

About the position

The Principal Data Engineer II at Charter Communications, Inc. is responsible for leading the design, development, and implementation of data pipelines and infrastructure to support hardware certification, Wi-Fi testing, and QA testing. This role involves collaborating with a team to develop data acquisition strategies, ensuring data quality, and providing technical leadership in data engineering best practices. The position requires strong technical skills in Python, Scala, and big data technologies, as well as the ability to communicate insights to senior leadership.

Responsibilities

  • Lead the design, development, and implementation of data pipelines and infrastructure.
  • Manage device and component vendors for hardware development.
  • Collaborate with the team to develop data acquisition strategies.
  • Ensure the accuracy, completeness, and quality of the data being collected.
  • Oversee and develop data models, ETL processes, and visualizations using Python and Scala.
  • Utilize Spark for data processing.
  • Work with PostgreSQL and cloud computing on Amazon Web Services.
  • Communicate data-driven insights and recommendations to senior leadership and stakeholders.
  • Provide technical leadership and guidance on data-related matters.
  • Mentor and coach team members to develop their technical skills.
  • Identify opportunities for using data to drive business outcomes.

Requirements

  • Bachelor's degree in Computer Science, Computer Engineering, or a related field.
  • 8 years of experience designing or implementing data pipelines and infrastructure.
  • Experience with big data technologies, including Spark.
  • Experience working with a wide range of data warehouses such as PostgreSQL, Athena, Redshift, RDS, MySQL, and DynamoDB.
  • 5 years of experience building ETL pipelines using Python or Scala.
  • 4 years of experience in cloud computing using Amazon Web Services or Google Cloud.

Benefits

  • Hybrid and remote work options.