Nityo Infotech - Scottsdale, AZ

posted 3 months ago

Full-time - Mid Level
Scottsdale, AZ
Professional, Scientific, and Technical Services

About the position

The QA Engineer position focuses on ensuring the quality and performance of applications within an AWS environment, using Java and Groovy for automation testing. The ideal candidate has a strong IT background, with 6-8 years of experience in testing and automation, and expertise across a range of technologies, including AWS services such as S3 and Glue, as well as Databricks, structured streaming, and Delta Lake concepts.

The candidate will architect and lead enterprise-wide automation testing initiatives, particularly for ETL processes and data migration. The role demands a deep understanding of streaming data pipelines, ETL/ELT processes, and their associated tooling. Candidates should be familiar with data quality frameworks such as Great Expectations and possess advanced SQL skills, including proficiency in joins, aggregation, and performance optimization techniques. The QA Engineer will also debug and troubleshoot complex technical issues, supporting the deployment and testing of large-scale, high-performance enterprise big data applications.

Experience with AWS architecture, Lambda functions, and CI/CD pipelines using GitLab is essential. The candidate should communicate effectively with technology partners and stakeholders, providing insight into technology solutions and their implications. The role calls for a collaborative mindset: the QA Engineer will work in a team-based environment and contribute to the overall success of the projects.

Responsibilities

  • Ensure the quality and performance of applications in an AWS environment.
  • Lead and architect enterprise-wide initiatives in automation testing for ETL processes.
  • Debug, troubleshoot, and implement solutions to complex technical issues.
  • Develop and maintain automation testing frameworks using Java and Groovy.
  • Collaborate with team members to optimize data migration and transformation processes.
  • Communicate technology solutions and constraints to stakeholders and senior management.
  • Utilize advanced SQL for data validation and performance optimization.
  • Work with AWS services, including S3 and Glue, to support data processing at scale.
  • Implement CI/CD pipelines using GitLab and maintain related documentation.
  • Apply data quality frameworks to ensure data integrity.

Requirements

  • 6-8 years of IT experience with a focus on testing and automation.
  • Proficiency in Java and Python programming languages.
  • Experience with AWS services, specifically S3 and Glue.
  • Strong understanding of Databricks, structured streaming, and Delta Lake concepts.
  • Experience programming in Spark with Scala and Java.
  • Advanced SQL skills, including joins, aggregation, and windowing functions.
  • Experience leading automation testing initiatives for ETL processes.
  • Understanding of streaming data pipelines and their differences from batch systems.
  • Familiarity with ETL/ELT tools and processes.
  • Ability to debug and troubleshoot complex technical issues.

Nice-to-haves

  • Familiarity with Great Expectations or other data quality/data validation frameworks.
  • Experience with AWS Lambda for data processing and optimization.
  • Knowledge of Schema Registry and data formats such as Avro and ORC.
  • Experience with architecture in AWS environments.

Benefits

  • Health insurance coverage
  • 401k retirement savings plan
  • Paid holidays and vacation time
  • Professional development opportunities
  • Flexible scheduling options