Request Technology - Chicago, IL

posted 4 days ago

Full-time
Administrative and Support Services

About the position

The Metadata and Data Lineage Analyst will be responsible for developing metadata and data lineage solutions for various data sources across both on-prem and cloud environments. This role involves working closely with technical subject matter experts and developers to understand application designs and create comprehensive data flow diagrams and mappings. The analyst will also automate metadata extraction processes and ensure compliance with data governance standards.

Responsibilities

  • Work with Technical SMEs/developers to understand applications/systems design and create data flow diagrams/data mappings.
  • Create Source to Target mapping documents by reverse engineering application Java code, BI tools, and SQL queries for identified data flows.
  • Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources.
  • Develop solutions to automate metadata extraction and create data flow/data lineage/Source to target mapping documents for complex applications/systems/BI tools.
  • Manage metadata administration and support, and ingest data management assets using extension mappings and custom data assets.
  • Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with Data Governance Standards, Policies, and Procedures.
  • Design and build data capabilities such as data quality, metadata, data catalog, and data dictionary.
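To give a flavor of the connector/scanner work described above, a minimal sketch in Python is shown below. It scans a database catalog and emits column-level metadata of the kind that feeds a data catalog or lineage tool. The helper name `extract_table_metadata` and the SQLite source are illustrative assumptions, not part of the role description; a real connector would target the shop's actual sources.

```python
import sqlite3


def extract_table_metadata(conn):
    """Scan a SQLite database and return {table: [(column, type), ...]}.

    Illustrative only: a production metadata scanner would read the
    catalog of whatever source system is in scope (RDBMS, NoSQL, BI tool).
    """
    metadata = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        metadata[table] = [(c[1], c[2]) for c in cols]
    return metadata


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    print(extract_table_metadata(conn))
```

The resulting dictionary is the kind of raw harvest that would then be mapped into data flow diagrams and source-to-target documents.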

Requirements

  • 6 or more years of data analysis experience with a robust understanding of metadata, data flows, and mappings.
  • Ability to understand Java code, and to read and/or write code in languages such as Java or Python.
  • Proficiency with SQL, plus experience working with Git and performing data analysis using Python/PySpark.
  • Hands-on experience with Java 8 or later, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams.
  • Experience with various types of databases including Relational, NoSQL, and Object-based databases.
  • Ability to review application development code to ensure it meets functional requirements and architectural standards.
  • Proficiency in writing technical documentation for Java-based applications that process data in real-time and batch.
  • Ability to develop and implement Data Quality Solutions for multiple data sources across On-Prem and Cloud environments.
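As a small illustration of the data-quality work named in the last bullet, the sketch below implements one common check: the null rate of a field across a batch of records. The `null_rate` name and the dict-based record shape are hypothetical; real solutions would run such checks against on-prem and cloud sources at scale (e.g. via PySpark).

```python
def null_rate(rows, field):
    """Fraction of records in `rows` where `field` is missing or None.

    Illustrative data-quality check; `rows` is a list of dicts.
    """
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(field) is None)
    return missing / len(rows)


if __name__ == "__main__":
    batch = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": None}]
    print(null_rate(batch, "email"))  # prints 0.5
```

A check like this would typically be parameterized per source and thresholded against governance standards.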

Nice-to-haves

  • Experience working with Protobuf, APIs, and Kafka as data sources.
  • Experience with draw.io or other tools for creating architecture or data flow diagrams.
  • Experience in object-oriented design and software design patterns.