Data Engineer - Sunnyvale, CA


Full-time - Mid Level
Remote - Sunnyvale, CA
5,001-10,000 employees

About the position

LinkedIn is seeking an experienced Data Engineer to enhance the efficiency and quality of data processing within the LinkedIn Marketing Solutions (LMS) team. This role focuses on developing and managing data pipelines, ensuring data accuracy, and collaborating with engineering teams to optimize data operations. The position offers a hybrid work option, allowing flexibility in work location while fostering a collaborative and innovative environment.

Responsibilities

  • Design, develop, and manage data pipelines and workflows for efficient data processing on HDFS datasets using Trino SQL/Spark SQL.
  • Perform code designs and review and approve test cases.
  • Implement data quality checks and audits to maintain high data accuracy and integrity.
  • Produce elegant, efficient designs and high-performance, scalable code that serves future needs.
  • Collaborate with cross-functional teams to understand data requirements and implement robust data solutions.
  • Gather data requirements from domain experts and translate business needs into technical specifications.
  • Act as a technical advisor and Subject Matter Expert for projects, providing advice on process and design.
  • Optimize data storage for performance and scalability, ensuring efficient ETL processes.
  • Develop and maintain documentation related to data pipelines, QA, metrics, and data policy.
  • Stay updated with industry best practices and emerging trends in data engineering and analytics.

Requirements

  • Bachelor’s degree in engineering, computer science, data science, or a related technical field.
  • 6+ years of experience in a data engineering or analytics engineering role.
  • 2+ years of experience using SQL and optimizing SQL databases for performance (Trino SQL or Spark SQL).
  • Experience managing data pipelines (like HDFS), repositories (like GitHub), and workflows (like Apache Airflow), and applying ETL best practices.
  • Ability to communicate complex technical concepts to both technical and non-technical individuals.
  • Experience working with multiple stakeholders, setting project priorities, and delivering on Objectives and Key Results (OKRs).
  • Experience automating script changes using Python.

Nice-to-haves

  • Master’s degree in engineering, computer science, or a related technical field (such as statistics or data science).
  • Excellent analytical skills in designing data workflows and analyzing data for anomalies.
  • Familiarity with data governance principles.
  • Program Manager experience.
  • Experience running a scrum team and using Jira.

Benefits

  • Annual performance bonus
  • Stock options
  • Health insurance
  • Dental insurance
  • Vision insurance
  • 401(k) plan with matching contributions
  • Flexible work hours
  • Paid time off
  • Tuition reimbursement
  • Professional development opportunities