Data Resource Technologies - Lincoln, NE

posted about 2 months ago

Full-time - Mid Level
Lincoln, NE
Professional, Scientific, and Technical Services

About the position

The ETL / Data Integration Developer develops and implements ETL processes from various data sources. The position involves creating high-level conceptual and logical models, producing quality documentation, and participating in requirements gathering. The developer also defines and implements data transformation, quality checking, cleansing, and standardization processes, and tests them for accuracy and performance. The role is part of a collaborative Agile team, requiring the flexibility to act as analyst, designer, developer, tester, or mentor as customer needs demand.

Responsibilities

  • Develop and implement ETL processes from various data sources.
  • Build high-level conceptual models and logical models.
  • Produce high-quality documentation for ETL processes.
  • Participate in requirements gathering and collect data definitions.
  • Define and implement data transformation, quality checking, cleansing, and standardization processes.
  • Test ETL processes for accuracy and performance.
  • Collaborate within an Agile team environment and perform various roles as needed.
  • Mentor team members and troubleshoot issues in ETL packages.
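The transformation, quality-checking, cleansing, and standardization duties above can be sketched in a minimal Python example (the field names and rules here are hypothetical illustrations, not part of the posting):

```python
def cleanse(record):
    """Transform one raw record: standardize fields, reject bad rows."""
    # Standardization: collapse whitespace and normalize casing on the name
    name = " ".join(record.get("name", "").split()).title()
    # Quality check: reject records missing required keys
    if not name or not record.get("id"):
        return None
    # Standardization: upper-case the state code
    state = record.get("state", "").strip().upper()
    return {"id": record["id"], "name": name, "state": state}

# Hypothetical source rows, as they might arrive from an extract step
rows = [
    {"id": "1", "name": "  ada   lovelace ", "state": "ne"},
    {"id": "", "name": "missing id", "state": "NE"},
]

# Apply the transform and drop rows that failed the quality check
clean = [r for r in (cleanse(row) for row in rows) if r is not None]
# → [{'id': '1', 'name': 'Ada Lovelace', 'state': 'NE'}]
```

In a production ETL pipeline the same pattern (per-record transform plus a reject path) would typically live inside a tool such as AWS Glue, Talend, or Azure Data Factory rather than standalone code.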

Requirements

  • Bachelor's degree in computer science or a related field, or equivalent experience.
  • Minimum of five years of experience related to ETL and data integration functions.
  • Extensive experience in data migration projects utilizing ETL/ELT tools.
  • Strong expertise in database modeling techniques.
  • Proven track record in application database re-engineering and data migration from legacy databases.
  • Skilled in developing complex T-SQL (SQL Server), stored procedures, and ETL frameworks.
  • Strong grasp of data warehousing and cloud computing concepts.
  • Proficient with tools such as MuleSoft, Qlik, Talend, DataStage, AWS Glue, AWS Data Pipeline, and Azure Data Factory.
  • Hands-on experience with database programming, Azure DevOps, Azure SQL, SQL performance tuning, data modeling, and ETL job scheduling.

Nice-to-haves

  • Experience developing and using APIs for sharing and consuming data.
  • Knowledge of data infrastructure both on-premises and in the cloud, preferably Azure or AWS.
  • Experience working in DevOps, Continuous Integration, and Continuous Delivery environments.