JPMorgan Chase - Wilmington, DE

posted 4 months ago

Full-time
Wilmington, DE
Credit Intermediation and Related Activities

About the position

Join us as we embark on a journey of collaboration and innovation, where your unique skills and talents will be valued and celebrated. Together we will create a brighter future and make a meaningful difference.

As a Lead Data Engineer at JPMorgan Chase within Enterprise Technology, you are an integral part of an agile team that works to enhance, build, and deliver data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As a core technical contributor, you are responsible for maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm's business objectives.

In this role, you will generate data models for your team using firmwide tooling, linear algebra, statistics, and geometrical algorithms, and you will deliver data collection, storage, access, and analytics platform solutions in a secure, stable, and scalable way. You will also implement database back-up, recovery, and archiving strategies, and evaluate and report on access control processes to determine the effectiveness of data asset security with minimal supervision. Finally, you will contribute to the team culture of diversity, equity, inclusion, and respect.

Responsibilities

  • Generates data models for their team using firmwide tooling, linear algebra, statistics, and geometrical algorithms.
  • Delivers data collection, storage, access, and analytics platform solutions in a secure, stable, and scalable way.
  • Implements database back-up, recovery, and archiving strategies.
  • Evaluates and reports on access control processes to determine effectiveness of data asset security with minimal supervision.
  • Adds to team culture of diversity, equity, inclusion, and respect.

Requirements

  • Formal training or certification on data engineering concepts and 5 years of applied experience.
  • Extensive ETL experience required; this could be across various products.
  • Working experience with both relational and NoSQL databases.
  • Experience and proficiency across the data lifecycle.
  • Experience with database back-up, recovery, and archiving strategy.
  • Proficient knowledge of linear algebra, statistics, and geometrical algorithms.
  • Proficient data modeling skills required.

Nice-to-haves

  • Data Engineering experience using products such as Databricks, Snowflake, Spark, and/or other cloud-based databases.
  • Data Engineering experience using cloud-based LLM tools a plus.
  • Extensive Business Intelligence experience a plus.