Data Engineer I

$85,300/Yr

Disney - Santa Monica, CA

posted about 2 months ago

Full-time - Entry Level
Santa Monica, CA
Motion Picture and Sound Recording Industries

About the position

At Disney Entertainment & ESPN Technology, we are dedicated to reimagining the way audiences experience our beloved stories. As a Data Engineer I, you will play a crucial role on the Product & Data Engineering team, which is responsible for the end-to-end development of Disney's consumer-facing products, including popular streaming platforms like Disney+, Hulu, and ESPN+. This position involves collaborating with various teams to design and build data pipelines that measure subscriber movements and metrics, ensuring that our products deliver exceptional experiences to millions of consumers worldwide.

In this role, you will partner with both technical and non-technical colleagues to understand data and reporting requirements. You will work closely with engineering teams to collect necessary data from internal and external systems, contributing to the design of table structures and defining ETL pipelines that are reliable and scalable. Your responsibilities will also include developing and maintaining ETL routines using tools such as Airflow, implementing database deployments, and performing ad hoc analyses as needed.

As a Data Engineer I, you will be expected to write code, complete programming tasks, conduct testing, and debug code to ensure the integrity and performance of our data solutions. You will also be involved in SQL and ETL tuning to optimize performance in a fast-growing data ecosystem. This position offers an exciting opportunity to innovate and shape the future of Disney's media business while working in a collaborative and dynamic environment.

Responsibilities

  • Partner with technical and non-technical colleagues to understand data and reporting requirements.
  • Work with engineering teams to collect required data from internal and external systems.
  • Contribute to the design of table structures and the definition of ETL pipelines, building performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
  • Work with the team to develop data quality checks; write, test, and debug code.
  • Develop and maintain ETL routines using ETL and orchestration tools such as Airflow.
  • Implement database deployments using tools like Schema Change.
  • Perform ad hoc analysis as necessary.
  • Perform SQL and ETL tuning as necessary.

Requirements

  • Good understanding of data modeling principles, including dimensional modeling and data normalization.
  • Good understanding of SQL engines and the ability to conduct advanced performance tuning.
  • Ability to think strategically, analyze and interpret market and consumer information.
  • Strong communication skills, both written and verbal.
  • Strong conceptual and analytical reasoning competencies.
  • Comfortable working in a fast-paced and highly collaborative environment.
  • Familiarity with Agile Scrum principles and ceremonies.

Nice-to-haves

  • Experience implementing and reporting on business key performance indicators in data warehousing environments.
  • Knowledge of analytic SQL and experience working with traditional relational databases and/or distributed systems (e.g., Snowflake or Redshift).
  • At least 1 year of experience with programming languages (e.g., Python, PySpark).
  • Experience with data orchestration/ETL tools (Airflow, Nifi).
  • Experience with Snowflake, Databricks/EMR/Spark, and/or Airflow.

Benefits

  • Medical insurance coverage
  • Financial benefits including bonuses and long-term incentive units
  • Flexible work arrangements
  • Professional development opportunities