Global Atlantic Financial Group - Boston, MA

posted 4 days ago

Full-time - Mid Level
Boston, MA
Insurance Carriers and Related Activities

About the position

Global Atlantic Financial Company (a subsidiary of Global Atlantic Financial Group Limited) is seeking a candidate for the position of Associate, Data Engineer in Brighton, MA. The role covers data and data pipeline architecture, ETL development, and enterprise data warehouse work; the full list of duties appears under Responsibilities below. Schedule: 40 hours/week, Monday through Friday, 9:00 a.m. to 5:00 p.m. Salary: $177,341 per annum.

Responsibilities

  • Expand and optimize the data and data pipeline architecture.
  • Optimize data flow and collection for cross-functional teams.
  • Perform data architecture analysis, design, development, and testing.
  • Deliver data applications, services, interfaces, ETL processes, reporting and other workflow and management initiatives.
  • Work closely with business, data analysts, and IT teams to support data strategy initiatives.
  • Ensure optimal data delivery architecture is consistent throughout the strategy.
  • Follow modern Software Development Life Cycle (SDLC) principles.
  • Maintain compliance with policies through test-driven development, source code reviews, and change control standards.
  • Design and develop enterprise data and data architecture solutions using Java, J2EE, Hadoop, Spark, Scala, Python, and SQL.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Create and maintain optimal data pipeline architecture.
  • Devise and execute continual improvement initiatives in Data Management Service delivery and technology.
  • Partner with business leaders to determine and prioritize delivery initiatives.
  • Define or influence system, technical and application architectures for major areas of development.
  • Engage with business partners to report on technology strengths, weaknesses, successes and challenges.
  • Execute the full software development life cycle, including requirements gathering, development, testing, release management, and maintenance.
  • Perform analytical programming in enterprise data warehouse (EDW) architecture to bridge the gap between a traditional database architecture and a Hadoop-centric architecture.

Requirements

  • Bachelor's degree in Computer Science, Information Science, Computer or Electronics Engineering, or a related computational and quantitative analytical field.
  • Eight (8) years of experience as a member of an information technology team.
  • Three (3) years of experience in data modeling, complex data structures, data processing, data quality, and data lifecycle management.
  • Working knowledge of SQL, including experience with relational databases and query authoring.
  • Performing root cause analysis on internal and external data and processes.
  • Using UNIX shell scripting, batch scheduling, and version control tools.
  • Working in large scale server-side application development.
  • Applying appropriate change management and other technology methodologies.
  • Advanced working knowledge of Data Management, Business Intelligence (BI) Architecture, Product Development, RDBMS, and non-RDBMS platforms.
  • Leveraging analytical skills to recognize data patterns and troubleshoot data.
  • Designing and delivering data solutions for data migration initiatives, BI initiatives, and dashboard development.
  • Working in the insurance domain.

Nice-to-haves

  • Utilizing Python, Core Java, Cloudera, AWS technologies, REST APIs, microservices, Spring Boot, and Spring Cloud.
  • Building and optimizing 'big data' data pipelines, architectures, and data sets.

Benefits

  • Highly competitive health, retirement, life and disability plans.
  • Customizable and comprehensive benefits package.
  • Community outreach and charitable giving programs.
  • Recognized with awards for Best Perks & Benefits, Best Company Work-Life Balance, Best Company Happiness, and Best Company Compensation.