There are still lots of open positions. Let's find the one that's right for you.
The Data Engineer (Spark & Scala) role is focused on developing scalable data solutions with Spark, Scala, and AWS. The position requires a strong understanding of data infrastructure and data modeling, as well as the ability to build end-to-end ETL pipelines; a minimal sketch of that kind of pipeline follows below. The role is based in Bloomfield, CT, and requires on-site presence from day one.
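To give a flavor of the day-to-day work, here is a minimal Spark-with-Scala ETL sketch: read raw CSV from object storage, clean and type the data, and write partitioned Parquet. The bucket paths, column names, and object name (OrdersEtl) are hypothetical placeholders, not details from the actual role.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    // Local-mode session for illustration; a production job would run on a cluster (e.g. EMR).
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .master("local[*]")
      .getOrCreate()

    // Extract: read raw CSV from a hypothetical S3 location.
    val raw = spark.read
      .option("header", "true")
      .csv("s3a://example-bucket/raw/orders/")

    // Transform: type the columns and keep completed orders only.
    val cleaned = raw
      .withColumn("order_date", to_date(col("order_date"), "yyyy-MM-dd"))
      .withColumn("amount", col("amount").cast("double"))
      .filter(col("status") === "COMPLETED")

    // Load: write partitioned Parquet for downstream consumers.
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-bucket/curated/orders/")

    spark.stop()
  }
}
```

In practice the same extract-transform-load shape scales from this toy example to production pipelines; the main differences are cluster deployment, schema enforcement, and orchestration rather than the core Spark API calls shown here.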