Epsilon - Chicago, IL

posted 4 months ago

Full-time - Senior
Chicago, IL
Professional, Scientific, and Technical Services

About the position

We are seeking a Senior Data Solutions Engineer to join our dynamic team in Chicago. The ideal candidate will bring extensive experience in big data and data warehousing, combined with a passion for developing innovative data solutions. You will play a crucial role in building and maintaining our MPP, HDFS, and Elastic platforms, which capture and analyze data from our Ad Stack and external sources.

In this role, you will be responsible for developing and optimizing complex SQL queries and Python scripts to ingest, transform, and analyze large datasets, ensuring high data quality and efficient processing. You will also manage and enhance data quality services, leveraging Python to continuously improve our systems and data quality indicators. As a Senior Data Solutions Engineer, you will provide expert-level insights into SQL, Python, and Linux commands, utilizing AWS Cloud tools and Databricks to support our data solutions.

Collaboration is key in this role, as you will work closely with client integration engineers, account managers, analysts, and other engineers to deliver effective data-oriented solutions. Staying current on emerging technologies and industry trends will be essential, as you will influence technical decisions and contribute to our evolving platform. Additionally, you will identify and resolve production issues, optimize performance, and address bottlenecks in data processing. Clear communication of technical concepts and solutions to internal teams and stakeholders will foster a collaborative environment and ensure project success.

Responsibilities

  • Develop and optimize complex SQL queries and Python scripts to ingest, transform, and analyze large datasets.
  • Ensure high data quality and efficient processing.
  • Manage and enhance data quality services, leveraging Python to continuously improve our systems and data quality indicators.
  • Provide expert-level insights into SQL, Python, and Linux commands.
  • Use AWS Cloud tools and Databricks to support our data solutions.
  • Work closely with client integration engineers, account managers, analysts, and other engineers to deliver effective data-oriented solutions.
  • Stay updated on emerging technologies and industry trends, influencing technical decisions and contributing to our evolving platform.
  • Identify and resolve production issues, optimize performance, and address bottlenecks in data processing.
  • Clearly articulate technical concepts and solutions to internal teams and stakeholders.

Requirements

  • 6-8 years of experience in data development, with a strong background in SQL, Python, and big data technologies.
  • Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
  • Proficient in SQL and Python; experience with Linux command line; familiarity with AWS Cloud tools and Databricks.
  • Understanding of distributed systems, relational databases, and data warehouse architecture.
  • Experience with Kafka, Flume, Spark, Scala, Java, NoSQL databases (HBase, Cassandra, ScyllaDB), MPP RDBMS, Postgres, Hadoop, Airflow, Docker, Kubernetes, and Elastic is a plus.
  • Excellent communication and problem-solving skills, with the ability to thrive in a collaborative team environment.

Nice-to-haves

  • Familiarity with the internet/digital advertising ecosystem.

Benefits

  • Flexible time off (FTO)
  • 14 paid holidays
  • Paid sick time
  • Parental/new child leave
  • Childcare & elder care assistance
  • Adoption assistance
  • Comprehensive health coverage
  • 401(k)
  • Tuition assistance
  • Commuter benefits
  • Professional development
  • Employee recognition
  • Charitable donation matching
  • Health coaching and counseling