This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

Rockland Trust Career Site · posted 16 days ago
Senior
Plymouth, MA

About the position

We are seeking an experienced Senior Data Engineer to join our team and help build and maintain our enterprise data lakehouse. The ideal candidate will have expertise in Azure Data Factory (ADF), Apache Airflow, dbt, and implementing medallion-style data architectures on modern cloud data warehouse platforms such as BigQuery, Snowflake, and Redshift.

Responsibilities

  • Design, implement, and maintain scalable data pipelines using ADF and dbt
  • Develop and optimize ELT processes within a medallion architecture (Bronze, Silver, Gold, and Semantic layers; see the sketch after this list)
  • Collaborate with data governance teams, analysts, and other stakeholders to understand data requirements and deliver high-quality datasets
  • Implement data quality checks and monitoring throughout the data lifecycle
  • Optimize query performance and data models for efficient analytics
  • Contribute to data governance and documentation efforts
  • Design and implement analytical models to enhance data insights and automation
  • Integrate analytical models into existing data pipelines and workflows
  • Stay updated with the latest AI and machine learning technologies and best practices
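
As a rough illustration of the Bronze/Silver/Gold layering and in-pipeline quality checks mentioned above, the sketch below uses pandas; the file path, column names, and quality rule are hypothetical placeholders, and a production pipeline would more likely run as dbt models or Spark jobs on the warehouse itself.

```python
# Minimal, hypothetical sketch of a medallion-style ELT flow using pandas.
# Paths, schemas, and the quality rule are illustrative placeholders only.
import pandas as pd


def build_bronze(raw_path: str) -> pd.DataFrame:
    """Bronze: land the raw data as-is, adding only load metadata."""
    bronze = pd.read_csv(raw_path)
    bronze["_loaded_at"] = pd.Timestamp.utcnow()
    return bronze


def build_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    """Silver: clean and conform the raw records."""
    silver = bronze.dropna(subset=["account_id", "amount"]).copy()
    silver["amount"] = silver["amount"].astype(float)
    # Simple data-quality check applied during the transformation step.
    if (silver["amount"] < 0).any():
        raise ValueError("Data quality check failed: negative transaction amounts")
    return silver


def build_gold(silver: pd.DataFrame) -> pd.DataFrame:
    """Gold: aggregate into an analytics-ready table."""
    return silver.groupby("account_id", as_index=False)["amount"].sum()


if __name__ == "__main__":
    gold = build_gold(build_silver(build_bronze("transactions.csv")))
    print(gold.head())
```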

Requirements

  • Bachelor's degree in Computer Science, Engineering, or related field
  • 5+ years of experience as a Data Engineer
  • Strong proficiency in SQL and Python
  • Hands-on experience with Azure Data Factory, AWS Glue, or Apache Airflow for workflow orchestration (see the DAG sketch after this list)
  • Expertise in using dbt for data transformation and modeling
  • Experience implementing medallion architecture or similar multi-layer data architectures
  • Familiarity with cloud data platforms (e.g., BigQuery, Snowflake, or Redshift)
  • Knowledge of data warehousing concepts and dimensional modeling
  • Experience with developing ML and statistical models
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills and ability to work in a collaborative environment
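
For the orchestration requirement, a minimal Apache Airflow DAG along the following lines is one common pattern; the DAG id, schedule, and task callables are illustrative assumptions rather than the bank's actual pipeline, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Illustrative Airflow DAG wiring an extract -> transform dependency.
# The dag_id, schedule, and callables are hypothetical examples only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("Pull source data into the Bronze layer")


def transform():
    print("Run dbt / transformation logic into Silver and Gold")


with DAG(
    dag_id="example_lakehouse_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```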

Nice-to-haves

  • Experience with Delta Lake or similar data lakehouse technologies
  • Familiarity with data science and machine learning concepts
  • Knowledge of data governance and compliance requirements
  • Experience with CI/CD practices for data pipelines
  • Experience dealing with data at financial institutions/banks
  • Experience with FIS IBS core banking system
  • Familiarity with Kafka, Kinesis, or similar data streaming services
  • Familiarity with microservices-based and event-driven architectures
  • Experience with efficient code development and debugging using generative AI tools such as GitHub Copilot
  • Understanding of MLOps practices and tools

Job Keywords

Hard Skills
  • AWS Glue
  • Azure Data Factory
  • GitHub
  • Python
  • SQL