Diverse Lynx - Mount Laurel Township, NJ

posted 2 months ago

Full-time
Mount Laurel Township, NJ
Professional, Scientific, and Technical Services

About the position

We are seeking a highly skilled Java Developer with expertise in Spark and Scala to join our team in Mount Laurel, NJ. This position requires a minimum of 7 years of experience in software development, particularly with big data technologies. The ideal candidate will have a strong background in Apache Spark, Spark SQL, and the broader Spark ecosystem, along with hands-on experience in Linux scripting. The role is primarily onsite, requiring the candidate to work five days a week in a collaborative environment.

The successful candidate will be responsible for developing and maintaining data processing applications using Spark and Scala, ensuring high performance and responsiveness to requests from the front end. You will work closely with cross-functional teams to design and implement scalable solutions for large-scale data processing. The ability to communicate effectively and collaborate with team members will be crucial to the success of our projects.

In addition to these technical skills, familiarity with big data technologies such as Hadoop, HDFS, and HBase is essential. Experience with Continuous Integration/Continuous Delivery (CI/CD) tools such as Git/Bitbucket, Gradle, Jenkins, Jira, and Confluence will be beneficial. The role also requires a solid understanding of agile methodologies, as you will be working in an agile environment to deliver high-quality software solutions.

Responsibilities

  • Develop and maintain data processing applications using Apache Spark and Scala.
  • Collaborate with cross-functional teams to design scalable solutions for large-scale data processing.
  • Implement and optimize Spark SQL queries for performance.
  • Utilize big data technologies such as Hadoop, HDFS, and HBase in application development.
  • Write and maintain Linux scripts for automation and data processing tasks.
  • Participate in the development of CI/CD pipelines using tools such as Git/Bitbucket, Gradle, and Jenkins.

Requirements

  • 7+ years of experience in software development with a focus on Java, Spark, and Scala.
  • Strong knowledge of Apache Spark, Spark SQL, and related tools in the Spark ecosystem.
  • Experience with big data technologies such as Hadoop, HDFS, and HBase.
  • Hands-on experience with Linux scripting.
  • Excellent communication and collaboration skills.
  • Familiarity with CI/CD tools such as Git/Bitbucket, Gradle, Jenkins, Jira, and Confluence.
  • Experience working in an agile development environment.