Infosys - Indianapolis, IN

posted 5 months ago

Full-time - Mid Level
Indianapolis, IN
Professional, Scientific, and Technical Services

About the position

Infosys is seeking an Azure Data Engineer Lead with a strong background in designing and building data integration and data quality pipelines that deliver data for end-user consumption. The ideal candidate will oversee the creation of scalable data pipelines, design data architectures, and ensure efficient ETL processes using services such as Azure Synapse, Azure Data Factory, and Azure Databricks. In this role, you will be responsible for ensuring data security and compliance and for optimizing performance while collaborating with stakeholders to meet data requirements. You will also maintain technical documentation and implement continuous improvement practices in application development and operations.

The candidate will use their understanding of each problem to develop multiple solution alternatives, weigh the pros and cons of each with the various stakeholders in mind, and select the optimal solution. This position requires a proactive approach to problem-solving and the ability to communicate effectively with both technical and non-technical stakeholders. As part of the team, you will be expected to lead initiatives that enhance the organization's data engineering capabilities, ensuring that data pipelines are robust, scalable, and efficient. You will also play a key role in mentoring junior team members and fostering a culture of continuous learning and improvement within the team.

Responsibilities

  • Oversee the creation of scalable data pipelines and design data architectures.
  • Ensure efficient ETL processes using Azure Synapse, Azure Data Factory, and Azure Databricks.
  • Ensure data security and compliance while optimizing performance.
  • Collaborate with stakeholders to meet data requirements.
  • Maintain technical documentation and implement continuous improvement practices in application development and operations.
  • Utilize understanding of problems to arrive at multiple solution alternatives and assess the pros and cons of each alternative.

Requirements

  • Bachelor's degree or foreign equivalent required from an accredited institution, or three years of progressive experience in the specialty in lieu of every year of education.
  • At least 4 years of experience in Information Technology.
  • At least 3 years of hands-on experience in using Azure Data Factory (ADF) to orchestrate data ingestion from various sources into Azure Synapse and Azure Data Lake Storage (ADLS).
  • At least 2 years of experience designing and implementing data pipelines using Azure Databricks for data cleaning, transformation, and loading into Azure Synapse Analytics.
  • At least 2 years of experience working in Support and Maintenance.
  • Well versed in Jira and ServiceNow, with a strong understanding of the ITIL process.
  • Good understanding of Change Management, ticketing processes, and L2/L3 support work.
  • Experience working in an onshore-offshore model, team leadership experience, and the ability to play a techno-functional role.

Nice-to-haves

  • Python knowledge is preferred.
  • Basic knowledge of Power BI (PBI) would also be a plus.
  • Experience in software engineering and data engineering roles, with a focus on Azure and Databricks.
  • Strong understanding of databases and big data software technologies, specifically Azure and Databricks.
  • Experience with Agile methodologies, particularly Scrum.
  • Experience in coordinating with offshore data engineering teams.
  • Strong problem-solving skills related to data, data structures, and algorithms.
  • Excellent communication and decision-making skills.