This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

DATAMAXIS · posted about 1 month ago

About the position

The ETL Developer is responsible for loading data into, and extracting data out of, the Teradata database. The ETL Developer must learn and understand both the technical structure of the data and its business use: knowing what the data is supposed to do and how end users will work with it supports accurate loading, staging, transformation, and testing/troubleshooting. Our team is responsible for a wide range of data to support business needs, and our developers work with their Lead to determine the steps (or plan) needed to implement each solution. Our projects range from small (8–10 hours) and medium (300–1,000 hours) to large (more than 1,000 hours). The developer will work with the customer to define the requirements, create the technical design, code the solution, and implement the code into development, UAT, and production.
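The extract, transform, and load cycle described above can be sketched in miniature. This is an illustrative Python example only, not the actual tooling used on the team; the field names and business rule are invented:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Read source rows from a CSV export (stand-in for an agency source feed)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply corrections and derive new fields, as a change request might require."""
    out = []
    for row in rows:
        out.append({
            # Correction: strip stray whitespace from the source file.
            "member_id": row["member_id"].strip(),
            "claim_amount": round(float(row["claim_amount"]), 2),
            # New derived field (hypothetical rule): flag high-value claims.
            "high_value": float(row["claim_amount"]) > 1000.0,
        })
    return out

def load(rows: list[dict], target: list) -> int:
    """Append cleaned rows to the target store and report the count loaded."""
    target.extend(rows)
    return len(rows)

raw = "member_id,claim_amount\n 1001 ,250.5\n1002,1500.75\n"
warehouse: list[dict] = []
loaded = load(transform(extract(raw)), warehouse)
```

In PowerCenter the same steps map onto a Source Qualifier, a chain of transformations, and a target definition inside a mapping, rather than hand-written functions.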

Responsibilities

  • Use Informatica PowerCenter to extract, transform, and load (ETL) source data from agency sources, applying the required transformations, correcting source files, and adding new fields via the change request process
  • Automate the staging of data by scheduling scripts to run at optimal times, improving overall system performance and efficiency
  • Create, maintain, and support the ETL/Informatica architecture and system environment for development and production
  • Perform domain/repository backups and recovery operations
  • Implement ETL jobs by creating mappings, mapplets, sessions, and workflows
  • Work with ETL transformations including Aggregator, Expression, Filter, Router, Sequence Generator, Update Strategy, Joiner, Rank, Source Qualifier, Lookup, and Sorter
  • Schedule and monitor ETL/Informatica jobs in development, system testing, and production environments
  • Develop proactive processes for capacity monitoring and performance tuning
  • Support and advise the development teams on technical issues
  • Troubleshoot production support issues after release deployment and develop solutions
  • Develop and maintain the complex SQL queries and scripts used to create data marts in Teradata
  • Participate in project discussions to document and recommend alternative technical solutions or requirements
  • Participate in unit, integration, and system testing
  • Maintain effective communication and relationships with everyone involved with the project
  • Learn new technologies as needed to perform job duties effectively
  • Provide status updates and meet deliverables for Optum's leadership and Optum's customer
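The data-mart SQL mentioned above typically aggregates detail rows into summary tables. A minimal sketch, using SQLite in place of Teradata and invented table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Detail table, standing in for a staged claims feed.
cur.execute("CREATE TABLE claims (member_id TEXT, svc_month TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("1001", "2024-01", 250.50),
     ("1001", "2024-01", 99.50),
     ("1002", "2024-02", 1500.75)],
)

# Data-mart summary: one row per member per service month, built CTAS-style.
cur.execute("""
    CREATE TABLE claims_mart AS
    SELECT member_id,
           svc_month,
           COUNT(*)    AS claim_count,
           SUM(amount) AS total_amount
    FROM claims
    GROUP BY member_id, svc_month
""")

rows = cur.execute(
    "SELECT member_id, claim_count, total_amount FROM claims_mart ORDER BY member_id"
).fetchall()
```

In Teradata the bulk load into the detail table would be handled by a utility such as FastLoad or TPT rather than row-by-row inserts.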

Requirements

  • 7+ years of Informatica PowerCenter / ETL Experience
  • 5+ years of complex SQL and Performance Tuning Experience
  • 5+ years of Teradata Version 15+ experience, including Teradata Utilities (Fastload, TPT, Multiload)
  • 5+ years of ETL / Informatica PowerCenter administrative experience
  • Excellent verbal/written communication skills, end-client facing, team collaboration, and mentoring skills
  • Experience with Unix Shell scripting
  • Strong organizational skills and the ability to set priorities and schedule work deadlines

Nice-to-haves

  • Location in the Springfield, IL area preferred
  • 5+ years of Data Warehouse experience
  • 5+ years of logical and physical data modeling experience using ERWIN or another data modeling tool
  • 2+ years of Azure Data Factory and Snowflake DB development
  • Experience in Windows, Unix/Linux scripting, R, React, or Python
  • Experience in Informatica Data Quality (IDQ), Address Doctor, Metadata Manager
  • Experience with State Medicaid / Medicare / Healthcare applications
  • Experience working in Agile
  • Experience with Data Warehouse technologies within the domain of ETL/data integration, Business Intelligence, relational and multi-dimensional database systems
  • Experience with Korn and Bash shell scripting
  • Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business Technology
  • Adapting to the team and organizational culture

Job Keywords

Hard Skills
  • Aggregator
  • Azure Data Factory
  • Bash
  • Informatica
  • Korn Shell