CloudZero - Boston, MA

posted 19 days ago

Full-time - Senior
Boston, MA
Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services

About the position

The Data Engineering Lead will be responsible for building and scaling the data infrastructure within the newly formed Research Group under the Office of the CTO. This role focuses on creating clean, reliable, and well-organized data models to support data-driven decision-making across the organization. The lead will also manage a team of data engineers and collaborate with AI/ML data scientists to provide high-quality data for various initiatives, using modern data engineering tools such as Snowflake and Looker.

Responsibilities

  • Develop and implement scalable, reliable, and secure data pipelines and architectures using Snowflake and other modern technologies.
  • Lead the data engineering team in building and maintaining ETL/ELT workflows, ensuring high data quality and performance.
  • Continuously optimize Snowflake queries, data models, and storage to meet business needs and reduce costs.
  • Collaborate with analytics and business intelligence teams to integrate Looker and other tools into the data ecosystem.
  • Define and enforce data governance policies, ensuring compliance with security and privacy regulations.
  • Provide technical mentorship to data engineers and analysts, fostering a culture of learning and innovation.
  • Partner with product managers, analysts, and stakeholders to understand business needs and translate them into technical solutions.

Requirements

  • 12+ years of experience in data engineering, with at least 3 years in a lead or principal role.
  • Deep expertise in Snowflake architecture, performance tuning, and cost optimization.
  • Strong proficiency in SQL and data modeling best practices.
  • Experience with Looker or similar BI platforms (e.g., Tableau, Power BI, Sigma).
  • Hands-on experience with ETL/ELT tools such as dbt, Matillion, or Airflow.
  • Familiarity with cloud platforms like AWS, Azure, or GCP.
  • Proven ability to design and implement scalable data pipelines in large-scale environments.
  • Strong communication skills and ability to work effectively with both technical and non-technical stakeholders.
  • Strong understanding of data governance, security, and compliance requirements (e.g., GDPR, HIPAA).
  • Programming skills in Python or another scripting language.
  • Experience with real-time data processing technologies (e.g., Kafka, Spark).
  • Knowledge of machine learning pipelines and data science workflows.
  • Certifications in Snowflake, Looker, or cloud platforms.