Data Engineering Group Manager

$176,720 - $265,080/Yr

Citigroup - Jersey City, NJ

posted 3 months ago

Full-time - Senior
Jersey City, NJ
10,001+ employees
Credit Intermediation and Related Activities

About the position

Citi, a leading global bank, is seeking a Data Engineering Group Manager for Core Accounts to spearhead the modernization and optimization of data distribution flows and integration with various data systems. This role is pivotal in developing engineering solutions that reduce latency, enhance flexibility, and minimize technical debt within the organization. The Data Engineering Group Manager will work closely with global teams to support a franchise-critical business and collaborate with Citi Technology Infrastructure teams, STE Standard Operating Environments, and application teams to evolve the data highways across Services.

The responsibilities of this position include re-engineering the interaction of incoming and outgoing data flows from the Core Accounts Demand Deposit Account (DDA) platform to Reference Data platforms, Data Warehouses, Data Lakes, and other local reporting systems. The manager will drive the data architecture and roadmap to eliminate non-strategic point-to-point connections and batch handoffs, define canonical data models for key entities and events related to Customer, Account, and Core DDA, and assess opportunities to simplify and rationalize existing database schemas. The role also involves rationalizing over 100 handoffs from DDA applications running on Ab Initio and Talend across more than 70 countries and defining a common taxonomy.

The Data Engineering Group Manager will also design an Operational Data Store for intra-day and end-of-day reporting, implement data strategies, and develop logical and physical data models. They will create self-service reporting capabilities for Operations using the latest tools and technologies, formulate efficient approaches to rationalize and migrate thousands of reports to new infrastructure, and build and nurture a strong engineering organization to deliver value to both internal and external clients. This position represents an incredible opportunity to be part of a worldwide transformation towards a future-state global digital banking platform.
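
To make the canonical-model work concrete, the sketch below shows what a shared Account and balance-event contract could look like in Python. Every field, type, and name here is an illustrative assumption, not Citi's actual data standard.

    from dataclasses import dataclass
    from datetime import datetime, date
    from decimal import Decimal
    from enum import Enum

    class EventType(Enum):
        # Hypothetical event taxonomy for Core DDA postings
        DEBIT = "DEBIT"
        CREDIT = "CREDIT"
        HOLD = "HOLD"

    @dataclass(frozen=True)
    class Account:
        # Canonical account entity shared by all downstream consumers
        account_id: str      # single global key replacing per-country identifiers
        customer_id: str     # link to the canonical Customer entity
        branch_code: str
        currency: str        # ISO 4217
        open_date: date

    @dataclass(frozen=True)
    class BalanceEvent:
        # Canonical event published once, instead of many point-to-point handoffs
        event_id: str
        account_id: str
        event_type: EventType
        amount: Decimal
        booking_ts: datetime     # event time in UTC
        source_system: str       # lineage back to the originating DDA application

Publishing one event contract of this kind would let the Reference Data platforms, Data Warehouses, and Data Lakes consume the same stream rather than maintaining bespoke point-to-point extracts.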

Responsibilities

  • Re-engineer the interaction of incoming and outgoing data flows from the Core Accounts DDA platform to Reference Data platforms, Data Warehouse, Data Lake, and local reporting systems.
  • Drive the data architecture and roadmap for eliminating non-strategic point-to-point connections and batch handoffs.
  • Define canonical data models for key entities and events related to Customer, Account, and Core DDA in line with Data Standards.
  • Assess opportunities to simplify, rationalize, and refactor existing database schemas to pave the way for modularization of the existing stack.
  • Rationalize 100+ handoffs from DDA applications running on Ab Initio and Talend in 70+ countries and define a common taxonomy.
  • Design an Operational Data Store for intra-day and end-of-day reporting (a minimal streaming sketch follows this list).
  • Implement data strategies and develop logical and physical data models.
  • Design self-service reporting capabilities for Operations using the latest tools and technologies.
  • Formulate efficient approaches to rationalize and migrate thousands of reports to new infrastructure.
  • Build and nurture a strong engineering organization to deliver value to internal and external clients.
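
As a rough illustration of the streaming and Operational Data Store responsibilities above, the following is a minimal Spark Structured Streaming sketch that reads the hypothetical canonical events from Kafka and lands them in an ODS table for intra-day reporting. The broker address, topic name, schema, and storage paths are all placeholders, and the example assumes a Spark deployment with the Kafka connector available.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import (StructType, StructField, StringType,
                                   DecimalType, TimestampType)

    spark = SparkSession.builder.appName("dda-ods-loader").getOrCreate()

    # Hypothetical canonical event schema (see the Account/BalanceEvent sketch above)
    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("account_id", StringType()),
        StructField("event_type", StringType()),
        StructField("amount", DecimalType(18, 2)),
        StructField("booking_ts", TimestampType()),
        StructField("source_system", StringType()),
    ])

    # Consume the canonical event stream instead of a nightly batch handoff
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
           .option("subscribe", "core-dda.balance-events")     # placeholder topic
           .load())

    events = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(from_json(col("json"), event_schema).alias("e"))
              .select("e.*"))

    # Land events in the Operational Data Store for intra-day reporting
    query = (events.writeStream
             .format("parquet")
             .option("path", "/ods/core_dda/balance_events")              # placeholder path
             .option("checkpointLocation", "/ods/_checkpoints/balance_events")
             .outputMode("append")
             .start())

    query.awaitTermination()

Writing the stream once and letting each downstream consumer read from the ODS (or from the topic itself) is the pattern that would replace the non-strategic batch handoffs described above.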

Requirements

  • Significant experience in data modeling, data lineage analysis, and operational reporting, preferably in a global organization.
  • Proven architecture experience in building horizontally scalable, highly available, and highly resilient data distribution platforms.
  • Proficient in message queuing, stream processing, and highly scalable 'big data' stores.
  • Advanced working knowledge of SQL, including query authoring, and experience with a variety of relational databases.
  • Experience building and optimizing 'big data' pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement (a small reconciliation sketch follows this list).
  • Strong analytic skills related to working with unstructured datasets.
  • Extensive experience with data integration patterns.
  • Extensive experience with real-time and near-real-time streaming patterns.
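
Below is a small, self-contained example of the kind of reconciliation query used in this sort of root-cause analysis. It uses SQLite from the Python standard library purely for illustration, and all table and column names are hypothetical.

    import sqlite3

    # In-memory example; in practice the two sides would be a legacy handoff
    # extract and the new ODS table. All tables, columns, and values are made up.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE legacy_handoff (country TEXT, account_id TEXT, balance NUMERIC);
        CREATE TABLE ods_accounts   (country TEXT, account_id TEXT, balance NUMERIC);
        INSERT INTO legacy_handoff VALUES ('SG','A1',100.0), ('SG','A2',250.0), ('HK','A3',75.0);
        INSERT INTO ods_accounts   VALUES ('SG','A1',100.0), ('HK','A3',80.0);
    """)

    # Per-country reconciliation: compare row counts and balance totals on each side
    breaks = conn.execute("""
        SELECT l.country,
               l.cnt                                  AS legacy_rows,
               COALESCE(o.cnt, 0)                     AS ods_rows,
               l.total - COALESCE(o.total, 0)         AS balance_diff
        FROM (SELECT country, COUNT(*) cnt, SUM(balance) total
              FROM legacy_handoff GROUP BY country) l
        LEFT JOIN (SELECT country, COUNT(*) cnt, SUM(balance) total
                   FROM ods_accounts GROUP BY country) o
          ON l.country = o.country
        WHERE l.cnt != COALESCE(o.cnt, 0)
           OR l.total != COALESCE(o.total, 0)
    """).fetchall()

    for row in breaks:
        print(row)  # e.g. ('SG', 2, 1, 250.0) -> missing account; ('HK', 1, 1, -5.0) -> balance break

Against real data the same query shape would run on the warehouse or ODS directly; the point is isolating which feeds and which measures diverge before digging into the pipeline itself.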

Nice-to-haves

  • Familiarity with big data platforms such as Hadoop and Apache Kafka.
  • Experience with relational SQL, NoSQL, and cloud-native databases like Postgres, Cassandra, and Snowflake.
  • Experience with data pipeline and orchestration tools such as Azkaban, Luigi, or Airflow (a minimal DAG sketch follows this list).
  • Experience with AWS cloud services including EMR, RDS, and Redshift.
  • Experience with stream-processing engines like Apache Spark, Apache Storm, or Apache Flink.
  • Experience with ETL tools such as Talend and Ab Initio.
  • Experience with Data Analytics/visualization tools like Looker, Mode, or Tableau.
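
As one illustration of the orchestration tooling listed above, here is a minimal Apache Airflow 2.x DAG sketch for a report-migration job. The DAG id, schedule, and task bodies are placeholders rather than an actual Citi pipeline.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_legacy_report(**context):
        # Placeholder: pull one legacy report definition from the old estate
        print("extracting legacy report for", context["ds"])

    def load_to_new_platform(**context):
        # Placeholder: re-point the report at the ODS / new reporting infrastructure
        print("loading migrated report for", context["ds"])

    with DAG(
        dag_id="report_migration_example",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                   # Airflow 2.4+; use schedule_interval on older versions
        catchup=False,
        tags=["core-dda", "reporting"],
    ) as dag:
        extract = PythonOperator(task_id="extract_legacy_report",
                                 python_callable=extract_legacy_report)
        load = PythonOperator(task_id="load_to_new_platform",
                              python_callable=load_to_new_platform)
        extract >> load

Each legacy report earmarked for migration could be templated into a DAG like this, which would keep the migration of thousands of reports scriptable and auditable.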

Benefits

  • Medical, dental & vision coverage
  • 401(k)
  • Life, accident, and disability insurance
  • Wellness programs
  • Paid time off packages including planned time off (vacation), unplanned time off (sick leave), and paid holidays.