Amentum - Topeka, KS

Full-time
Topeka, KS
Professional, Scientific, and Technical Services

About the position

The Data Integration Engineer at Amentum is responsible for designing, developing, and implementing enterprise data integration and pipeline architectures. This role focuses on facilitating the collection, storage, and management of data for various analytical, operational, and decision-making purposes, ensuring effective integration between Amentum systems and external data sources.

Responsibilities

  • Design and implement data integration and pipeline architectures to facilitate data collection, storage, and management.
  • Develop and maintain ETL processes and data workflows for efficient data transfer.
  • Select and implement suitable integration methods for enterprise needs.
  • Create and enforce data quality checks to ensure accuracy and integrity of integrated data.
  • Collaborate with data engineers and analysts to deliver tailored data resources for analytics and reporting.
  • Optimize data integration processes for improved performance and reliability.
  • Implement data privacy and security measures in compliance with regulations.
  • Troubleshoot and resolve data integration issues, including performance bottlenecks.
  • Document data integration processes for clarity and maintainability.
  • Design and manage frameworks for enterprise data analysis and processing.
  • Recommend and implement improvements for data reliability and quality.
  • Process, clean, and verify the integrity of enterprise data sets.
  • Design and operate data management systems for business intelligence needs.
  • Plan and optimize data throughput and query performance.
  • Build data and analytics proofs of concept to gain deeper insights into datasets.

Requirements

  • Bachelor's degree in Computer Science or related field and 5 years of experience.
  • Excellent communication and analytical skills.
  • Demonstrated working knowledge of Microsoft SQL Server (T-SQL), Oracle (PL/SQL), Azure Synapse, Azure Data Lake Storage (ADLS), Azure Pipelines, Azure Spark notebooks, REST APIs, JSON, Parquet, and Delta Lake file formats, Data Vault 2.0, Synapse serverless SQL, Synapse Data Warehouse, Azure DevOps, and GitHub.
  • Experience with Agile development and data encryption/decryption.
  • Knowledge of API-based integration and file transfer tools such as GoAnywhere.

Nice-to-haves

  • Experience with MuleSoft, Boomi, IBM WebSphere MQ, RabbitMQ, or Apigee.
  • Knowledge of common integration patterns, including asynchronous vs. synchronous integration.
  • Knowledge and application of event-driven integration.