This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

Infosys - Richardson, TX

posted about 2 months ago

Full-time - Mid Level
Richardson, TX
Professional, Scientific, and Technical Services

About the position

The Data Engineer position at Infosys focuses on core programming across multiple languages, with an emphasis on performance, quality, scalability, and extensibility. The role involves driving innovation within the chosen domain and collaborating with teams to develop enterprise-level software applications using an agile DevSecOps model. Working in a full-stack capacity, the engineer will join a small team that leverages new technologies to address challenges in both front-end and back-end architecture, ultimately enhancing user experiences for a global audience.

Responsibilities

  • Develop and implement software applications using multiple programming languages.
  • Drive innovation in the chosen domain and contribute to enterprise-level software solutions.
  • Collaborate with teams to build scalable and extensible applications following agile DevSecOps practices.
  • Work on both front-end and back-end architecture to deliver exceptional user experiences.
  • Categorize, catalog, cleanse, and normalize datasets for effective data management.
  • Provide user access to datasets through REST and Python APIs.
  • Perform extraction, transformation, and loading (ETL) of data from various sources using Python, SQL, and AWS technologies (a brief illustrative sketch follows this list).

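For illustration only, the sketch below shows the kind of Python/SQL/AWS ETL step referenced in the responsibilities above: it extracts a CSV from Amazon S3, applies a simple cleansing and normalization pass, and loads the result into a SQL table. The bucket, key, table, and database URL are hypothetical placeholders, and the choice of pandas, boto3, and SQLAlchemy is an assumption rather than a stack prescribed by the role.

    # Minimal ETL sketch (assumptions: pandas + boto3 + SQLAlchemy; all names are placeholders)
    import io

    import boto3
    import pandas as pd
    from sqlalchemy import create_engine


    def run_etl(bucket: str, key: str, db_url: str, table: str) -> int:
        """Extract a CSV from S3, normalize it, and load it into a SQL table."""
        # Extract: download the raw CSV object from S3
        s3 = boto3.client("s3")
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(body))

        # Transform: trim column names, drop exact duplicates, normalize a date column
        df.columns = [c.strip().lower() for c in df.columns]
        df = df.drop_duplicates()
        if "order_date" in df.columns:  # hypothetical column
            df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")

        # Load: append the cleansed rows into the target SQL table
        engine = create_engine(db_url)
        df.to_sql(table, engine, if_exists="append", index=False)
        return len(df)


    if __name__ == "__main__":
        rows = run_etl(
            bucket="example-raw-data",        # placeholder bucket
            key="exports/orders.csv",         # placeholder key
            db_url="sqlite:///warehouse.db",  # placeholder database URL
            table="orders_clean",
        )
        print(f"Loaded {rows} rows")

In practice, large datasets would typically be transformed in Spark or in the database rather than in pandas; the sketch only illustrates the extract-transform-load shape.
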
Requirements

  • Bachelor's degree or foreign equivalent from an accredited institution; in lieu of a degree, three years of progressive experience in the specialty may be substituted for each year of education.
  • Minimum of 4 years of core development experience in relevant technology stacks.
  • Deep expertise in Scala or Python for Spark application development (see the PySpark sketch after this list).
  • Strong knowledge and hands-on experience in SQL and Unix shell scripting.
  • Experience in end-to-end implementation of projects using Cloudera Hadoop, Spark, Hive, HBase, Sqoop, Kafka, Elasticsearch, Grafana, and ELK stack.

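As a minimal illustration of the Spark development expertise listed above, the sketch below uses PySpark to read a Hive table, compute a per-customer daily rollup, and write the result back to Hive. Database, table, and column names are hypothetical, and the example assumes a Hive-enabled SparkSession; a real Cloudera deployment would carry cluster-specific configuration.

    # Minimal PySpark aggregation sketch (table and column names are hypothetical placeholders)
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # A Hive-enabled session; on a Cloudera cluster this would pick up the site configs
    spark = (
        SparkSession.builder
        .appName("daily-transaction-rollup")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read a source Hive table (placeholder name)
    txns = spark.table("raw_db.transactions")

    # Aggregate: total amount and transaction count per customer per day
    daily = (
        txns
        .withColumn("txn_date", F.to_date("txn_timestamp"))
        .groupBy("customer_id", "txn_date")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"),
        )
    )

    # Write the rollup back to a curated Hive table, partitioned by date
    (
        daily.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .saveAsTable("curated_db.daily_customer_rollup")
    )

    spark.stop()
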
Nice-to-haves

  • Experience in data warehousing technologies and ETL/ELT implementations.
  • Sound knowledge of software engineering design patterns and practices.
  • Strong understanding of functional programming principles.
  • Experience with tools such as Ranger, Atlas, Tez, Hive LLAP, Neo4J, NiFi, and Airflow.
  • Knowledge of cloud and containerization technologies like Azure, Kubernetes, OpenShift, and Docker.
  • Experience with data visualization tools such as Tableau and Kibana.
  • Experience in designing and implementing ETL/ELT frameworks for complex data warehouses/marts (an orchestration sketch follows this list).
  • Knowledge of large datasets and experience with performance tuning and troubleshooting.
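
Purely as an illustration of the ETL/ELT framework and orchestration tooling mentioned above, the sketch below defines a minimal Airflow DAG with placeholder extract, transform, and load steps. The DAG id, schedule, and task callables are hypothetical and stand in for real framework components.

    # Minimal Airflow DAG sketch (DAG id, task names, and callables are hypothetical placeholders)
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder: pull raw data from a source system
        print("extracting")


    def transform():
        # Placeholder: cleanse and normalize the extracted data
        print("transforming")


    def load():
        # Placeholder: load curated data into the warehouse/mart
        print("loading")


    with DAG(
        dag_id="example_etl_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Run the steps in order: extract -> transform -> load
        t_extract >> t_transform >> t_load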

Benefits

  • Competitive salary and performance bonuses.
  • Health insurance and wellness programs.
  • Opportunities for professional development and continued education.