Orb Enterprises - New York, NY

posted 3 months ago

Full-time
New York, NY
Computer and Electronic Product Manufacturing

About the position

The Scala Spark Developer with Java role is a critical position focused on developing and maintaining data processing applications using Spark and Scala. The successful candidate will work onsite at the Broadway, NY office three days a week, collaborating with cross-functional teams to understand data requirements and design efficient solutions. The role requires a strong foundation in Big Data technologies, particularly Scala and Spark, as well as Core Java.

The developer will implement test-driven development practices to improve application reliability and will deploy artifacts from lower to higher environments, ensuring a smooth transition throughout the deployment process. The developer will also troubleshoot and debug Spark performance issues to keep data processing running optimally. Working in an agile environment, the developer will contribute to sprint planning, development, and the timely delivery of high-quality solutions. The role also involves supporting production batches, addressing issues, and implementing fixes to meet critical business needs. This position is ideal for someone who thrives in a collaborative environment and is passionate about leveraging technology to solve complex data challenges.

Responsibilities

  • Develop and maintain data processing applications using Spark and Scala.
  • Collaborate with cross-functional teams to understand data requirements and design efficient solutions.
  • Implement test-driven development practices to enhance application reliability.
  • Deploy artifacts from lower to higher environments, ensuring a smooth transition.
  • Troubleshoot and debug Spark performance issues to ensure optimal data processing.
  • Work in an agile environment, contributing to sprint planning, development, and delivering high-quality solutions on time.
  • Provide essential support for production batches, addressing issues and providing fixes to meet critical business needs.

Requirements

  • 5+ years of experience in Big Data technologies.
  • Strong knowledge of Scala programming language.
  • Proficiency in Spark, including the development and optimization of Spark applications.
  • Excellent problem-solving and analytical skills.
  • Ability to troubleshoot and debug performance issues in Spark.
  • Understanding of design patterns and data structures for efficient data processing.
  • Familiarity with database concepts and SQL.
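For illustration only, the kind of Scala fluency the requirements describe might look like the sketch below: a Spark-style word-count aggregation written against the plain Scala standard library (no Spark dependency; in a real Spark job the same transformation chain would run on an RDD or Dataset rather than a local `Seq`; all names here are invented for the example, not from the posting):

```scala
// Plain-Scala sketch of a Spark-style aggregation pipeline.
// The chained transformations mirror Spark's API shape:
// flatMap -> map to (key, 1) pairs -> group and sum per key.
object WordCountSketch {
  def wordCounts(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))            // split each line into words
      .filter(_.nonEmpty)                  // drop empty tokens
      .map(w => (w.toLowerCase, 1))        // emit (word, 1) pairs
      .groupBy(_._1)                       // group pairs by word
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum counts per word

  def main(args: Array[String]): Unit =
    println(wordCounts(Seq("Spark and Scala", "Scala spark jobs")))
}
```

On an actual Spark `RDD`, `groupBy` followed by a per-key sum would typically be replaced with `reduceByKey(_ + _)` to avoid shuffling full value lists.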

Nice-to-haves

  • Java
  • Snowflake
  • Test-driven development practices
  • Python
  • Databricks
  • Understanding of DevOps practices