New York Life - Lebanon, NJ

posted 5 days ago

Full-time - Mid Level
Lebanon, NJ
Insurance Carriers and Related Activities

About the position

The Data Engineer role within the Enterprise Data Management team at New York Life focuses on building, expanding, and optimizing data pipeline architecture using AWS cloud technologies. This position is crucial for supporting data needs across various applications and systems, enabling machine learning and data science capabilities, and ensuring efficient data integration and processing.

Responsibilities

  • Design and develop enterprise infrastructure and platforms required for data engineering.
  • Use AWS Cloud technologies to support the data needs of expanding Machine Learning/Data Science capabilities, applications, mobile apps, systems, BI/analytics, and cross-functional teams.
  • Create methods and routines to transition data from on-premises systems to the AWS Cloud.
  • Store data in the AWS Cloud platform and develop transformation logic based on business rules.
  • Create and maintain optimal data pipeline architecture. Test and implement data environments.
  • Assemble large, complex data sets to meet functional/non-functional business requirements.
  • Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using SQL and AWS big data technologies.
  • Utilize AWS Cloud tools like AWS DMS, AWS Glue, AWS S3, and AWS Lambda to ingest data sets and create pipelines.
  • Develop and debug code in PySpark, Spark, and Python, and use SQL and/or Postgres (see the sketch after this list).
  • Use Data Pipeline, workflow management, and orchestration service tools.
  • Collaborate with Cloud Data Architects, Data Engineers, DBAs, Analytics & Software Engineers.
  • Interact with stakeholders, Product Management, and Design teams to understand data pipeline requirements, design solutions, and resolve issues.
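
For illustration only, the following is a minimal PySpark sketch of the kind of pipeline described above: it reads a raw data set landed in S3, applies a simple business-rule transformation, and writes a curated data set back to S3. The bucket paths, column names, and rule are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical S3 locations; replace with real bucket/prefix names.
    RAW_PATH = "s3://example-raw-bucket/policies/"
    CURATED_PATH = "s3://example-curated-bucket/policies/"

    spark = SparkSession.builder.appName("policy-etl-sketch").getOrCreate()

    # Ingest raw data landed in S3 (e.g., by AWS DMS or AWS Glue).
    raw_df = spark.read.parquet(RAW_PATH)

    # Example business rule: keep active policies and standardize a date column.
    curated_df = (
        raw_df
        .filter(F.col("policy_status") == "ACTIVE")
        .withColumn("effective_date", F.to_date(F.col("effective_date"), "yyyy-MM-dd"))
    )

    # Write the curated data set back to S3, partitioned for downstream analytics.
    curated_df.write.mode("overwrite").partitionBy("effective_date").parquet(CURATED_PATH)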

Requirements

  • 3+ years of experience as a Cloud Data Engineer, Data Engineer, Data Architect, AWS Engineer, or in a similar role.
  • 3+ years of experience with AWS services, including RDS, DMS, S3 data lakes, and Lambda, in shared-service and hybrid environments.
  • Experience with highly scalable data stores, data lakes, data warehouses, lakehouses, and unstructured datasets.
  • Knowledge of data integration, data processing, data streaming, message queuing, and/or ETL/ELT.
  • Ability to lead a team of developers.
  • Bachelor's degree in Data Science, Computer Engineering, or a related field preferred.

Benefits

  • Leave programs
  • Adoption assistance
  • Student loan repayment programs
  • Comprehensive benefit options