Ibotta - Denver, CO

posted 8 days ago

Full-time - Mid Level
Denver, CO

About the position

Ibotta is seeking an Analytics Engineer, Feature Engineering to join the Core Data & Analytics team. This role focuses on collaborating with data scientists to enhance data features for machine learning models, ensuring high-quality analytics products and services across the organization. The position is hybrid, requiring three days in the office, and is based in Denver, Colorado.

Responsibilities

  • Collaborate with data scientists to identify and extract relevant features from raw data.
  • Preprocess and transform data to make it suitable for machine learning models.
  • Create new features by combining, modifying, or aggregating existing features.
  • Evaluate the quality of features and assess their impact on model performance.
  • Monitor feature drift and update features over time to maintain model performance.
  • Keep up with the latest advancements in feature engineering techniques and tools.
  • Apply engineering best practices to deploy and maintain high-quality, curated datasets using Airflow, building automated alerting and anomaly detection into data flows to ensure data quality and integrity.
  • Optimize pipeline performance for real-time or near-real-time model execution.
  • Work across the full technology stack (Databricks, Spark, Command Line, Airflow, GitHub, Python, Monte Carlo, etc.) to develop and maintain datasets.
  • Manage new data requirements and develop solutions that minimize technical debt creation.

Requirements

  • 3+ years of practical work experience in data engineering, machine learning, or equivalent experience as an analytics engineer.
  • Bachelor's degree in Computer Science, Engineering, Analytics, or a related field required.
  • Working knowledge and some practical experience with end-to-end analytics automation, data pipelines, ETL/ELT processes, and tools (AWS Glue, DBT, etc.).
  • Experience with the AWS ecosystem and cloud-based data warehousing and architecture.
  • Familiarity with Airflow, Databricks, Git, and Monte Carlo, plus strong SQL and experience with multiple programming languages and frameworks (Python, Scala, Spark, command line), is highly preferred.
  • Development experience in a modern BI/data visualization platform (Looker, Tableau, etc.).
  • Exposure to event-driven architectures and platforms is a strong plus.
  • Ability to develop solutions by applying data quality principles.
  • Strong problem-solving skills and ability to think creatively to answer business questions using data.
  • Experience identifying and troubleshooting data anomalies and pipeline issues.

Nice-to-haves

  • Experience managing and updating cluster configurations to keep workflows running.
  • Excellent oral and written communication skills.

Benefits

  • Competitive pay
  • Flexible time off
  • Medical, dental, and vision benefits
  • Lifestyle Spending Account
  • Employee Stock Purchase Program
  • 401k match
  • Paid parking
  • Bagel Thursdays
  • Snacks and occasional meals