Ibotta - Denver, CO

Full-time - Senior Level

About the position

Ibotta is seeking a Senior Analytics Engineer to join our innovative Core Data & Analytics team and contribute to our mission to Make Every Purchase Rewarding. In this role, you will use the latest developments in ETL and ELT processes to provide the Revenue teams, including Client Partnership and Client Sales, with trusted, curated datasets. These datasets serve as the foundation for critical data products that deliver actionable insights. You will also take a leadership role in cross-functional projects and mentor junior team members, fostering a collaborative work environment.

This position is hybrid, requiring three days in the office (Tuesday, Wednesday, and Thursday) in Denver, Colorado. Candidates must reside in the United States.

As a Senior Analytics Engineer, you will organize and optimize data for analysis, collaborate with other Analytics Engineers and Client Analytics peers to define requirements, and normalize datasets for business use. You will identify, validate, document, and conduct user acceptance testing (UAT) for event-based data across the Revenue team, ensuring that the datasets are reliable and maintainable. You will develop Revenue datasets with a focus on data quality principles, creating standards and change management processes for events used in analytics workflows. You will implement engineering best practices to deploy and maintain quality curated datasets using tools like Airflow, including automated alerting and anomaly detection to ensure data integrity. You will work across our full technology stack, including Databricks, Spark, the command line, Airflow, GitHub, Python, and Monte Carlo, to develop and maintain these datasets. Additionally, you will build UI and automation tools to enable data democratization throughout the company and lead large, complex task force projects that require cross-functional input.
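
To make the pipeline work concrete, here is a minimal, illustrative sketch of an Airflow DAG that refreshes a curated Revenue dataset and fails loudly when a simple row-count anomaly check does not pass. It is not taken from the posting: the DAG name, schedule, threshold, and task logic are hypothetical.

    # Illustrative sketch only: DAG name, schedule, threshold, and task logic
    # are hypothetical and not taken from the posting.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    EXPECTED_MINIMUM_ROWS = 1_000  # hypothetical alerting threshold


    def refresh_revenue_dataset() -> int:
        # Placeholder for the Spark/Databricks job that rebuilds the curated
        # table; the return value is pushed to XCom for the downstream check.
        return 0


    def check_row_count(ti) -> None:
        # Simple anomaly guard: raising here fails the task, which surfaces
        # through Airflow's normal failure alerting.
        row_count = ti.xcom_pull(task_ids="refresh_dataset")
        if row_count < EXPECTED_MINIMUM_ROWS:
            raise ValueError(
                f"Row count {row_count} below expected minimum {EXPECTED_MINIMUM_ROWS}"
            )


    with DAG(
        dag_id="revenue_curated_dataset",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 1, "retry_delay": timedelta(minutes=10)},
    ) as dag:
        refresh = PythonOperator(
            task_id="refresh_dataset", python_callable=refresh_revenue_dataset
        )
        validate = PythonOperator(
            task_id="row_count_check", python_callable=check_row_count
        )

        refresh >> validate

A failed check in a DAG like this surfaces through Airflow's standard failure notifications, which is the kind of automated alerting and anomaly detection the role describes.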

Responsibilities

  • Organize and optimize data for analysis as an Ibotta Data Ninja.
  • Mentor junior Analytics Engineer team members.
  • Collaborate with other Analytics Engineers and Client Analytics peers to define requirements and normalize datasets for use.
  • Identify, validate, document, and conduct UAT for event-based data for business use across the Revenue team.
  • Understand and document Revenue team business logic to maintain datasets.
  • Develop Revenue datasets with data quality principles in mind, creating standards and change management processes for events used in analytics workflows (an illustrative example of such checks follows this list).
  • Implement engineering best practices to deploy and maintain quality curated datasets using Airflow, including automated alerting and anomaly detection.
  • Work across the full technology stack (Databricks, Spark, Command Line, Airflow, GitHub, Python, Monte Carlo) to develop and maintain datasets.
  • Build UI and automation tools to enable data democratization throughout the company.
  • Lead large, complex task force projects requiring cross-functional input.
  • Manage new data requirements and develop solutions that minimize technical debt.
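
For a sense of the data quality checks referenced above, the following is an illustrative PySpark validation that could guard a curated Revenue dataset before it is published. The table, column, and grain names are hypothetical and not taken from the posting.

    # Illustrative sketch only: table, column, and grain names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("revenue_dataset_checks").getOrCreate()

    df = spark.table("analytics.revenue_curated")  # hypothetical curated table

    # Check 1: key columns must never be null.
    null_keys = df.filter(
        F.col("client_id").isNull() | F.col("event_date").isNull()
    ).count()

    # Check 2: the table should have exactly one row per client per event date.
    duplicate_keys = (
        df.groupBy("client_id", "event_date")
        .count()
        .filter(F.col("count") > 1)
        .count()
    )

    if null_keys or duplicate_keys:
        raise ValueError(
            f"Data quality check failed: {null_keys} null keys, "
            f"{duplicate_keys} duplicated keys"
        )

Checks like these can run as a final step in the pipeline so that a broken dataset fails the run instead of reaching downstream Revenue data products.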

Requirements

  • 5+ years of practical work experience in a data engineering role supporting an analytics team or equivalent experience as an analytics engineer.
  • Bachelor's degree in Computer Science, Engineering, Analytics, or a related field required.
  • Working knowledge and practical experience with ETL/ELT processes and tools (AWS Glue, DBT, etc.).
  • Experience with AWS Ecosystem and cloud-based data warehouse architecture.
  • Familiarity with Airflow, Databricks, Git, and Monte Carlo.
  • Proficiency in multiple languages and frameworks, including Python, Scala, Spark, and the command line, with strong SQL skills.
  • Experience with modern BI/data visualization platforms (Looker, Tableau, etc.).
  • Understanding of event-driven architectures and platforms is a strong plus.
  • Ability to develop solutions applying data quality principles.
  • Experience identifying and troubleshooting data anomalies and pipeline issues.
  • Excellent oral and written communication skills.

Nice-to-haves

  • Experience with event-driven architectures and platforms.
  • Familiarity with data quality principles and practices.

Benefits

  • Competitive pay
  • Flexible time off
  • Medical, dental, and vision benefits
  • Lifestyle Spending Account
  • Employee Stock Purchase Program
  • 401k match
  • Paid parking in Denver office
  • Bagel Thursdays
  • Snacks and occasional meals