SAIC - Chantilly, VA

posted 2 months ago

Full-time - Senior
Onsite - Chantilly, VA
Professional, Scientific, and Technical Services

About the position

SAIC is seeking an experienced, results-oriented, mission-driven Big Data Architect with a specialized focus on data engineering. This role involves data model design, data formatting, and ETL (Extract, Transform, Load) development optimized for efficient storage, access, and computation in support of national security objectives.

The Big Data Architect will be a key member of an Agile team, tasked with increasing innovation capacity and accelerating the development of data ingestion and data analysis processes. Responsibilities include synchronizing efforts with other tasks to assemble the data technologies that control the flow of data from source to value, with the goal of speeding up the process of deriving insight from data.

The ideal candidate will have a passion for unlocking the secrets held by datasets, along with solid experience developing, automating, and enhancing every stage of the data pipeline: ingestion, processing, storage, and exposing data for consumption. The Big Data Architect will also implement data tests for quality assurance and improve inefficient tooling while adopting new, transformative technologies without disrupting operational continuity. This position requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that meet national security objectives.

Responsibilities

  • Perform data model design and data formatting.
  • Develop ETL processes optimized for efficient storage and access.
  • Increase innovation capacity and drive the velocity of data ingestion and analysis.
  • Synchronize efforts with other tasks to control the flow of data from source to value.
  • Implement data tests for quality assurance.
  • Improve inefficient tooling and adopt new transformative technologies.
  • Maintain operational continuity while enhancing data pipelines.

Requirements

  • Active TS/SCI with Polygraph Clearance
  • Bachelor's Degree in Computer Science, Information Systems, Engineering, or equivalent experience
  • 14 years of overall related professional experience
  • 3+ years of hands-on development experience using Java, JavaScript, and Python for ETL processes
  • Experience with data formats such as XML, JSON, and YAML
  • 3+ years of experience using and ingesting data into SQL and NoSQL database systems
  • Familiarity with the NEXIS platform
  • Experience with Apache NiFi
  • Experience programming in Apache Spark and PySpark

Nice-to-haves

  • Familiarity with building containerized services (e.g., via Docker)
  • Familiarity with the Databricks platform
  • Experience developing and maintaining data processing flows
  • Experience with Amazon Web Services (AWS)
  • Experience with CI/CD pipelines
  • Experience with Agile methodologies and the Kanban framework
  • Experience designing database schemas with relational databases such as MySQL and/or Oracle
  • Experience with Linux, REST services, and HTTP