Infosys - Austin, TX

posted 6 months ago

Full-time
Austin, TX
Professional, Scientific, and Technical Services

About the position

In the role of Spark Developer with Scala/Python, you will interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle (SDLC), including Requirements Elicitation, Application Architecture definition, and Design. This position is crucial because you will be responsible for creating the high-level design artifacts that guide the development process. You will also deliver high-quality code for assigned modules, lead validation for all types of testing, and support implementation, transition, and warranty activities.

As a Spark Developer, you will be part of a collaborative learning culture where teamwork is encouraged, excellence is rewarded, and diversity is respected and valued. You will work closely with various teams to ensure that software solutions meet the needs of the business and adhere to best practices in software development, and you will be expected to stay current with the latest technologies and methodologies in the field to continuously improve the quality of your work and the efficiency of the development process.

Responsibilities

  • Interface with key stakeholders to gather requirements and define application architecture.
  • Create high-level design artifacts to guide the development process.
  • Deliver high-quality code deliverables for assigned modules.
  • Lead validation for all types of testing including unit, integration, and system testing.
  • Support implementation, transition, and warranty activities.
  • Collaborate with team members to ensure effective communication and project execution.

Requirements

  • Bachelor's degree or foreign equivalent from an accredited institution; three years of progressive experience in the specialty may be considered in lieu of each year of education.
  • At least 4 years of experience in Information Technology.
  • At least 3 years of hands-on experience with distributed Hadoop frameworks, handling large volumes of data using Spark and the Hadoop ecosystem.
  • At least 2 years of experience with Spark.
  • At least 2 years of experience with Scala, Python, or Java (see the brief sketch after this list for the kind of Spark work involved).
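
For context on the Spark and Scala requirements above, the following is a minimal, illustrative sketch only; the dataset, column names, and file paths are assumptions, not details from this posting. It shows a small Spark batch job in Scala that reads a CSV file, aggregates it, and writes the result as Parquet.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    // Minimal illustrative Spark batch job (hypothetical data and paths).
    object SampleBatchJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("sample-batch-job")
          .getOrCreate()

        // Read a hypothetical CSV of transactions with a header row.
        val transactions = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("s3://example-bucket/transactions/")   // assumed input location

        // Aggregate total amount per customer (assumed column names).
        val totals = transactions
          .groupBy(col("customer_id"))
          .agg(sum(col("amount")).alias("total_amount"))

        // Write results as Parquet (assumed output location).
        totals.write.mode("overwrite").parquet("s3://example-bucket/output/customer_totals/")

        spark.stop()
      }
    }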

Nice-to-haves

  • At least 1 year of AWS development experience.
  • At least 1 year of hands-on experience with Scala.
  • Ability to work within deadlines and effectively prioritize and execute on tasks.
  • Strong communication skills (verbal and written) with the ability to communicate across teams, internal and external at all levels.
  • Experience in driving automation initiatives.
  • DevOps knowledge is an added advantage.
  • Proficiency in SQL with any RDBMS (a small illustrative example follows this list).
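
As an illustration of the SQL/RDBMS point above, the sketch below reads a table from a relational database into Spark over JDBC and runs a Spark SQL query against it. The connection URL, credentials, and table name are placeholders, not details from this posting.

    import org.apache.spark.sql.SparkSession

    // Illustrative only: all connection details below are placeholders.
    object JdbcReadExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("jdbc-read-example")
          .getOrCreate()

        // Load a hypothetical "orders" table from an RDBMS over JDBC.
        val orders = spark.read
          .format("jdbc")
          .option("url", "jdbc:postgresql://db-host:5432/sales")  // placeholder URL
          .option("dbtable", "public.orders")                     // placeholder table
          .option("user", "example_user")                         // placeholder credentials
          .option("password", "example_password")
          .load()

        // Query the table with Spark SQL.
        orders.createOrReplaceTempView("orders")
        spark.sql("SELECT status, COUNT(*) AS n FROM orders GROUP BY status").show()

        spark.stop()
      }
    }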