Northern Trust - Chicago, IL

posted about 2 months ago

Full-time - Senior
Chicago, IL
10,001+ employees
Real Estate

About the position

Northern Trust is seeking a dynamic Sr. Lead Cloud Engineer to support its key data platforms, including Snowflake and Databricks. This role involves designing, implementing, and maintaining data storage and processing solutions to ensure efficient data flow and accessibility. The successful candidate will collaborate with data engineers and analysts to support data-driven decision-making across the organization, and will be crucial to optimizing the performance and scalability of data platform environments to meet Northern Trust's data needs.

The engineer will bring a deep understanding of Snowflake's infrastructure and services, with a focus on enhancing reliability, availability, serviceability, performance, and cost efficiency. Responsibilities include designing and developing database product applications, contributing to scaling initiatives, and automating account and database administration. The role also involves implementing data sharing and encryption policies to protect sensitive data, proposing performance optimizations for database queries, developing monitoring architectures, and working closely with Cloud leadership and application teams to gather requirements.

Responsibilities

  • Apply a deep understanding of Snowflake's infrastructure and services to optimize reliability, availability, serviceability, performance, and cost efficiency.
  • Design, develop, and unit-test various modules of database product applications primarily on Snowflake and Databricks.
  • Contribute to initiatives aimed at scaling processes, systems, services, and automation.
  • Build data engineering and data ingestion modules for Snowflake.
  • Implement automation for administering accounts, databases, and access roles.
  • Implement Snowflake standards and methodologies for database security at both infrastructure and application levels.
  • Implement cross-cloud and cross-region data sharing and replication for internal and external clients.
  • Implement data encryption, decryption, and masking policies to protect PII and PCI data.
  • Propose and implement database and Snowflake query performance optimizations to manage costs.
  • Build foundational database infrastructure automation to manage traffic and scale.
  • Create data structures and design data models tailored to Snowflake's architecture.
  • Demonstrate knowledge of cloud managed services, particularly Azure CLI.
  • Develop monitoring architecture and implement monitoring agents, dashboards, and alerts.
  • Work with Cloud leadership, Product Management, and Application Teams on requirements.
  • Develop infrastructure-as-code (IaC) patterns and modules using Terraform.
  • Utilize modern software DevOps and CI/CD tooling to provision infrastructure resources and prevent configuration drift.
  • Investigate and resolve complex cloud infrastructure-related issues and recommend solutions.
  • Participate in IT Service Management (ITSM) change, incident, and general requests for cloud platform support.

Requirements

  • Bachelor's degree in computer science, engineering, or related technical fields.
  • 10+ years of technology experience.
  • 5+ years of cloud computing experience, specifically with Microsoft Azure.
  • 3 to 5+ years of experience with Snowflake.
  • Strong development skills in Python and associated libraries.
  • Experience with Terraform for Infrastructure as Code (IaC) automation.

Nice-to-haves

  • 5+ years of work experience with the Snowflake data platform.
  • 5+ years of hands-on experience with Python libraries such as Pandas, Flask, and Django.
  • Experience with Databricks data platform.
  • Proficiency in Python, Terraform (IaC), SQL, and Java.
  • Strong hands-on database experience, preferably with SQL.
  • Experience with data analytics, PySpark, and Big Data.
  • Strong debugging skills to identify issues and provide solutions.
  • Understanding of data structures, data modeling, and software architecture.
  • Experience creating data architectures and dealing with large volumes of data.
  • Experience using statistical languages to manipulate data and draw insights from data sets.
  • Experience with DevOps and CI/CD tools such as GitHub Actions.
  • Ability to manage large, complex projects from conception to completion with minimal guidance.
  • Strong software engineering fundamentals, coding skills, and knowledge of SDLC.
  • Proficiency in various delivery methodologies including Agile, Scrum, Kanban, and SAFe.

Benefits

  • Flexible and collaborative work culture.
  • Opportunities for career growth and development.
  • Commitment to assisting the communities served.
  • Reasonable accommodations for individuals with disabilities.