Infosys - Richardson, TX

posted 2 months ago

Full-time
Richardson, TX
Professional, Scientific, and Technical Services

About the position

Infosys is seeking an Azure and Databricks Data Engineer to enable digital transformation for our clients through a global delivery model. In this role, you will research technologies independently, recommend appropriate solutions, and contribute to technology-specific best practices and standards. You will interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle (SDLC). This position is designed for individuals who thrive in a learning culture where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.

As an Azure and Databricks Data Engineer, you will play a crucial role in the end-to-end implementation of projects utilizing Azure, Databricks, Python, PySpark, and SQL. Your strong knowledge and hands-on experience in SQL and Unix shell scripting will be essential in delivering high-quality solutions, and you will engage with various stakeholders to ensure that technical solutions align with business needs and objectives.

This role may require travel within the United States. Candidates must be located within commuting distance of Richardson, TX, or be willing to relocate to the area.

Responsibilities

  • Enable digital transformation for clients through a global delivery model.
  • Research technologies independently and recommend appropriate solutions.
  • Contribute to technology-specific best practices and standards.
  • Interface with key stakeholders throughout the Software Development Life Cycle.
  • Implement projects end-to-end using Azure, Databricks, Python, PySpark, and SQL.
  • Apply technical proficiency in SQL and Unix shell scripting.

Requirements

  • Bachelor's degree or foreign equivalent from an accredited institution; three years of progressive experience in the specialty may be considered in lieu of each year of education.
  • At least 4 years of Information Technology experience.
  • Experience in end-to-end implementation of projects in Azure, Databricks, Python, PySpark, and SQL.
  • Strong knowledge and hands-on experience in SQL and Unix shell scripting.

Nice-to-haves

  • Sound knowledge of software engineering design patterns and practices.
  • Strong knowledge of data structures, data engineering concepts, algorithms, collections, multi-threading, memory management, and concurrency.
  • Experience in large scale cloud data migrations using Databricks, Python, Spark, and SQL.
  • Good understanding of Agile software development frameworks.
  • Experience in the Banking/Financial domain.
  • Experience in Snowflake.
  • Strong communication and analytical skills.
  • Ability to work in teams in a diverse, multi-stakeholder environment.