American National Insurance - Springfield, MO

posted 3 months ago

Full-time - Mid Level
Springfield, MO
Insurance Carriers and Related Activities

About the position

American National is seeking a Sr Business Intelligence Data Engineer for our League City, TX location. The Sr Business Intelligence Data Engineer is responsible for the delivery of the corporate data warehouse and the overall execution of data integration solutions. This role is crucial in supporting strategic, tactical, and operational data warehousing, ensuring that data is acquired from the correct source system and transformed with the correct logic so that meaningful information is delivered in a timely manner.

In this position, you will develop and deploy the extraction, transformation, and loading (ETL) routines for the data warehouse environment. This involves sourcing data from operational systems, applying business transformation rules, and loading it into the data warehouse and data mart environments. You will analyze and develop strategies for data acquisition, archiving, recovery, and implementation of the ETL repository and load jobs. Determining the optimal approach for obtaining data from diverse source system platforms and moving it to the data warehouse will be a key responsibility. You will also implement and execute strategies for change data capture, create testing methodologies and criteria, and perform detailed data analysis, including monitoring production data for quality and consistency. Investigating, analyzing, documenting, and correcting reported defects will also be part of your duties.

You will develop specifications for data cleansing, data scrubbing, ETL, and data migration processes, and maintain standards, common processes, and best practices for ETL architecture. Your role will also involve designing and implementing automation of data loading and cleansing processes, modifying ETL processes and pipelines to accommodate source system changes, and populating the data warehouse using data modeling techniques such as Star schemas, Snowflake schemas, and highly normalized data models. You will interpret written business requirements and transform them into technical specifications, creating and maintaining technical specification documentation throughout the process.
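
To make the description above concrete, the sketch below shows one common shape for such an ETL routine: a watermark-based incremental load, which is a simple form of change data capture. It is illustrative only; the table and column names (stg_policy_txn, fact_policy_txn, etl_watermark) are hypothetical, and the SQLite connection stands in for the SQL Server or Synapse targets a real job would reach through a tool such as SSIS or Azure Data Factory.

```python
# Illustrative watermark-based incremental load (a simple change data capture
# pattern). All table and column names are hypothetical placeholders.
import sqlite3


def incremental_load(conn: sqlite3.Connection, job: str = "fact_policy_txn") -> int:
    """Load only the staging rows modified since the last successful run."""
    cur = conn.cursor()

    # 1. Read the high-water mark left by the previous run (etl_watermark is
    #    assumed to use `job` as its primary key).
    cur.execute("SELECT last_modified FROM etl_watermark WHERE job = ?", (job,))
    row = cur.fetchone()
    watermark = row[0] if row else "1900-01-01 00:00:00"

    # 2. Upsert changed rows with a delete-then-insert pattern so the full
    #    source table is never reloaded.
    cur.execute(
        "DELETE FROM fact_policy_txn WHERE txn_id IN "
        "(SELECT txn_id FROM stg_policy_txn WHERE modified_at > ?)",
        (watermark,),
    )
    cur.execute(
        "INSERT INTO fact_policy_txn (txn_id, policy_key, txn_amount, modified_at) "
        "SELECT txn_id, policy_key, txn_amount, modified_at "
        "FROM stg_policy_txn WHERE modified_at > ?",
        (watermark,),
    )
    loaded = cur.rowcount

    # 3. Advance the watermark so the next run starts where this one ended.
    cur.execute(
        "INSERT OR REPLACE INTO etl_watermark (job, last_modified) "
        "SELECT ?, COALESCE(MAX(modified_at), ?) FROM stg_policy_txn",
        (job, watermark),
    )
    conn.commit()
    return loaded
```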

Responsibilities

  • Develop and deploy the extraction, transformation, and loading (ETL) routines for the data warehouse environment.
  • Analyze and develop strategies for data acquisition, archiving, recovery, and implementation of the ETL repository and load jobs.
  • Determine the optimal approach for obtaining data from diverse source system platforms and moving it to the data warehouse.
  • Implement and execute strategies for change data capture.
  • Create testing methodologies and criteria.
  • Perform detailed data analysis, including monitoring production data for quality and consistency.
  • Investigate, analyze, document, and correct reported defects.
  • Develop specifications for data cleansing, data scrubbing, ETL, and data migration processes.
  • Develop and maintain standards, common processes, and best practices for ETL architecture, including batch job auditing and balancing (see the sketch after this list).
  • Design and implement automation of data loading and cleansing processes.
  • Modify ETL processes/pipelines to accommodate source system changes and new business user requirements.
  • Populate the data warehouse using data modeling techniques for target structures such as Star schemas, Snowflake schemas, and highly normalized data models.
  • Interpret written business requirements and transform them into technical specifications.
  • Create and maintain technical specification documentation.
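
The batch auditing and balancing item above might, for example, be backed by a small post-load check like the following sketch. It is not a prescribed implementation: the table and column names (stg_policy_txn, fact_policy_txn, txn_id, policy_key) are hypothetical, SQLite stands in for the warehouse connection, and a production job would typically log these metrics per batch rather than simply return them.

```python
# Illustrative post-load audit-and-balance check on hypothetical staging and
# fact tables.
import sqlite3


def audit_load(conn: sqlite3.Connection) -> dict:
    """Return simple audit-and-balance metrics for the most recent load."""
    cur = conn.cursor()

    # Balancing: every staged transaction should now exist in the fact table.
    staged = cur.execute("SELECT COUNT(*) FROM stg_policy_txn").fetchone()[0]
    missing = cur.execute(
        "SELECT COUNT(*) FROM stg_policy_txn s "
        "WHERE NOT EXISTS (SELECT 1 FROM fact_policy_txn f WHERE f.txn_id = s.txn_id)"
    ).fetchone()[0]

    # Consistency: loaded fact rows must carry a resolvable dimension key.
    orphaned = cur.execute(
        "SELECT COUNT(*) FROM fact_policy_txn WHERE policy_key IS NULL"
    ).fetchone()[0]

    return {
        "staged_rows": staged,
        "missing_in_fact": missing,
        "orphaned_fact_rows": orphaned,
        "balanced": missing == 0 and orphaned == 0,
    }
```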

Requirements

  • Extensive hands-on experience with SQL query language development.
  • 4+ years of experience with an ETL tool such as SSIS, Azure Data Factory, Informatica, or DataStage.
  • 4+ years of experience in developing data warehouses/data marts.
  • 3+ years of experience developing cloud-based pipelines in Azure Data Factory or AWS Glue.
  • Excellent knowledge of data warehouse concepts, technologies, and best practices, including data sourcing, data transformation, data loading, data cleansing, data integrity checking, and end-user tools.
  • Hands-on experience with at least one massively parallel processing platform, such as Netezza, Teradata, or Synapse.
  • Knowledge of data warehousing tools, patterns, and processes.
  • Experience in database design and strong familiarity with SQL Server best practices.
  • Excellent interpersonal, written, and verbal communication skills.

Nice-to-haves

  • Mainframe DB2 experience
  • Familiarity with Agile methodology

Benefits

  • Medical insurance
  • Dental insurance
  • Vision insurance
  • Basic life insurance
  • 401(k) plan
  • Paid time off
  • Paid holidays