Amentum - Boston, MA

posted 2 months ago

Full-time
Boston, MA
Professional, Scientific, and Technical Services

About the position

Amentum is seeking a dynamic and motivated Data Integration Engineer to design, develop, and implement enterprise data integration and pipeline architectures. This role focuses on facilitating the collection, storage, and management of data for various analytical, operational, and decision-making purposes, ensuring appropriate integration between Amentum systems and external data sources.

Responsibilities

  • Design and implement data integration and pipeline architectures to facilitate data collection, storage, and management.
  • Develop and maintain ETL processes and data workflows for efficient data transfer between systems.
  • Select and implement suitable integration methods for enterprise needs.
  • Create and enforce data quality checks to ensure accuracy and integrity of integrated data.
  • Collaborate with data engineers and analysts to deliver tailored data resources for analytics and reporting.
  • Optimize data integration processes for improved performance, scalability, and reliability.
  • Implement data privacy and security measures in compliance with regulations and company policies.
  • Troubleshoot and resolve issues related to data integration, including performance bottlenecks.
  • Document data integration processes, including data flow diagrams and data models.
  • Design and manage the overall framework for enterprise data analysis and processing.
  • Recommend and implement improvements for data reliability, efficiency, and quality.
  • Process, clean, and verify the integrity of enterprise data sets.
  • Design and operate data management systems for business intelligence needs.
  • Plan, design, and optimize data throughput and query performance.
  • Build data and analytics proofs of concept to gain deeper insights into datasets.

Requirements

  • Bachelor's degree in Computer Science or a related field and 5 years of experience.
  • Excellent communication and analytical skills.
  • Demonstrated working knowledge of and experience with Microsoft SQL Server (T-SQL), Oracle SQL (PL/SQL), Azure Synapse, Azure ADLS, Azure Pipelines, Azure Spark Notebooks (Scala, Python, Spark SQL), REST APIs, JSON files, Parquet files, Delta Lake files, Data Vault 2.0, SQL Serverless, Synapse Data Warehouse (MPP - Dedicated), Microsoft DevOps and GitHub, Agile development, data encryption/decryption, API-based integration, API management, and file transfer tools such as GoAnywhere.
  • Experience with one or more of the following: MuleSoft, Boomi, WebSphere MQ, RabbitMQ, or Apigee.
  • Knowledge of various integration patterns, including asynchronous vs. synchronous integration and event-driven integration.