ManTech - Chantilly, VA

posted 2 months ago

Full-time - Senior
Chantilly, VA
Professional, Scientific, and Technical Services

About the position

ManTech is seeking a Data Architect (SME) for the Enterprise Data Pipeline (EDP) Content Warehouse, a pivotal role responsible for designing and overseeing a scalable, cloud-based repository that formats, enriches, analyzes, and makes data from the Data Lake available to users. The Content Warehouse serves as a centralized hub for parsing, indexing, and securing data in support of Enterprise Search, Analytics, and mission-driven applications. The Data Architect will use tools and technologies such as Apache NiFi, Apache Tika, Databricks, Lakehouse architecture, ElasticSearch, and AWS SQS to fulfill these responsibilities.

In this role, the Data Architect will collaborate closely with stakeholders to gather requirements and translate them into actionable data architecture designs that align with the organization's strategic goals. The position requires a strong understanding of cloud-based data processing and storage technologies, along with experience implementing and optimizing data processing workflows. The Data Architect will also ensure data security and compliance within the Content Warehouse, adhering to relevant policies, standards, and regulations.

The successful candidate will stay abreast of emerging technologies and best practices in data architecture, cloud computing, and big data analytics; provide technical guidance and support to team members and stakeholders; and prepare and deliver presentations communicating project updates, recommendations, and findings to colleagues, subordinates, and government representatives. This role is crucial to the success of FBI missions and requires specialized technical expertise in agile methodologies, particularly the Scaled Agile Framework (SAFe).

Responsibilities

  • Design and architect the scalable, cloud-based repository for the Enterprise Data Pipeline (EDP) Content Warehouse.
  • Ensure the efficient formatting, enrichment, analysis, and availability of data within the Content Warehouse.
  • Collaborate with stakeholders to gather requirements and translate them into data architecture designs that align with organizational goals.
  • Implement and optimize data processing workflows using tools such as Apache NiFi and Apache Tika.
  • Leverage Databricks and Lakehouse architecture to enable efficient data storage, processing, and analytics.
  • Implement search capabilities using ElasticSearch to enable Enterprise Search functionality.
  • Ensure data security and compliance within the Content Warehouse, adhering to relevant policies, standards, and regulations.
  • Collaborate with cross-functional teams to integrate data from various sources into the Content Warehouse.
  • Stay updated on emerging technologies and best practices in data architecture, cloud computing, and big data analytics.
  • Provide technical guidance and support to team members and stakeholders.
  • Prepare and deliver presentations to colleagues, subordinates, and government representatives to communicate project updates, recommendations, and findings.

Requirements

  • Bachelor's degree in Business, Engineering, Management Sciences, Computer Science, Information Systems, Social Science, Education, Human Resources Development, Psychology, or other related disciplines.
  • Ten to twelve years of experience in data architecture, data modeling, and database design.
  • Strong knowledge and experience with cloud-based data processing and storage technologies.
  • Proficiency with technologies such as Apache NiFi, Apache Tika, Databricks, Lakehouse architecture, ElasticSearch, and AWS SQS.
  • Experience with data integration, transformation, and enrichment processes.
  • Familiarity with data security and compliance requirements in a cloud environment.
  • Strong analytical and problem-solving skills, with the ability to design and optimize data workflows.
  • Excellent communication and presentation skills, with the ability to effectively convey complex technical concepts to both technical and non-technical audiences.
  • Ability to work collaboratively with cross-functional teams and stakeholders.
  • Relevant certifications such as AWS Certified Solutions Architect, Databricks Certified Developer, or Elastic Certified Engineer are highly desirable.

Nice-to-haves

  • Experience with Scaled Agile Framework (SAFe) implementation.
  • Knowledge of FBI mission requirements and data needs.

Benefits

  • Competitive salary and performance bonuses.
  • Health, dental, and vision insurance.
  • 401(k) retirement plan with company matching.
  • Paid time off and holidays.
  • Professional development opportunities and training programs.