Data Architect

$101,400 - $183,300/Yr

Leidos - Marietta, GA

posted 2 months ago

Full-time - Mid Level
Marietta, GA
Professional, Scientific, and Technical Services

About the position

The Health Mission Solutions team is seeking a Data Architect, contingent upon contract award. The Data Architect will work closely with the Program Manager, Cloud Solutions Architect, data engineers, and data analysts to support multiple public health project teams across the full data management cycle, including data ingestion, cleansing, transformation, security, exploration, and visualization, using Microsoft Azure tools and technologies. These tools include Databricks, Spark Streaming, Azure SQL, Delta Lake, Azure Data Factory, HDInsight, and notebooks. This position requires hands-on experience with data architecture in an Azure cloud environment. Candidates MUST have been located in the United States for the past three consecutive years and be eligible to obtain a Public Trust level 5 clearance.

Specific roles and responsibilities for the Data Architect position include advising, supporting, and coaching project teams in ingesting data, creating data pipelines, selecting appropriate Azure services, optimizing data storage, cataloging data, enforcing technical and architectural standards, and troubleshooting development and production issues. The Data Architect will design and implement data security measures to ensure PII/PHI data is protected from unauthorized access, and will incorporate data governance into the solution design, including policies, procedures, and standards for managing and using data. Continuous optimization of the performance of data pipelines in Databricks and Azure Data Factory (ADF) is also a key responsibility. The Data Architect will investigate and recommend new technologies to modernize the data pipeline process and stay current on the latest advancements in data technologies.

Collaboration with customer SMEs on data projects to develop data pipeline architectures and strategies is essential, as is mentoring project teams and data engineers on best practices and new technologies, and collaborating with data engineers, business analysts, and testers on agile development teams to implement the data architecture. The role involves actively leading and participating in discovery, validation, and verification throughout the development life cycle and engaging in process improvement initiatives. The Data Architect will identify, evaluate, and demonstrate solutions to complex system problems and will design and develop documentation, including procedures, process flow diagrams, work instructions, and protocols for processes.

Responsibilities

  • Advise, support, and coach project teams in ingesting data and creating data pipelines.
  • Select appropriate Azure services and optimize data storage.
  • Catalog data and enforce technical & architectural standards.
  • Troubleshoot development & production issues.
  • Design and implement data security measures to protect PII/PHI data.
  • Incorporate data governance into solution design, including policies and standards.
  • Continuously optimize the performance of data pipelines in Databricks and Azure Data Factory (ADF).
  • Investigate and recommend new technologies to modernize the data pipeline process.
  • Stay current on the latest advancements in data technologies.
  • Collaborate with customer SMEs on data projects to develop data pipeline architectures and strategies.
  • Mentor project teams and data engineers on best practices and new technologies.
  • Collaborate with data engineers, business analysts, and testers on agile development teams to implement the data architecture.
  • Actively lead and participate in the discovery, validation, and verification process throughout the development life cycle.
  • Engage in process improvement initiatives.
  • Identify, evaluate, and demonstrate solutions to complex system problems.
  • Design and develop documentation including procedures, process flow diagrams, work instructions, and protocols for processes.

Requirements

  • Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 8+ years of professional experience; or 6+ years of professional experience with a related Master's degree.
  • Proven data architecture experience on a large-scale Azure Data Lake platform.
  • Experience onboarding and managing multiple high-complexity data pipelines processing millions of records per day.
  • Experience working simultaneously with multiple data sources and entities submitting data daily to the data lake, and delivering technical assistance to ensure successful operations.
  • Experience building Azure cloud-based ETL processes and data pipelines to automate data workflows.
  • Experience implementing automated processes to QC data products and pipelines before data release, including de-duplication of data.
  • Experience handling and delivering big data analytics for daily users.
  • Strong prior experience with and expert knowledge of Databricks, Delta Lake, HDInsight, and Azure Data Factory, specifically data pipeline architecture and automation of data pipelines.
  • Prior experience integrating applications with AI/ML technologies including chatbots.
  • Ability to collaborate with and influence customer leadership and external teams on data initiative strategies.
  • One or more relevant Microsoft Azure certifications.
  • Ability to present complex ideas and subject matter to stakeholders and customer leadership.
  • Proven experience working in a development environment following agile practices and processes.
  • Experience developing documentation including specifications, procedures, process flow diagrams, work instructions, and protocols for processes.
  • Proven experience with supporting highly critical customer missions.
  • Prior proven leadership experience.
  • Excellent verbal and written communication skills, including experience working directly with customers to discuss their requirements and objectives.
  • Proven experience in multi-tasking and managing efforts to the schedule.
  • Ability to learn and support new systems and applications.

Nice-to-haves

  • Working experience at CDC or other federal agencies.
  • Experience with Azure DevOps and CI/CD pipelines.
  • Azure Databricks Platform Architect certification, Databricks Accredited Lakehouse Fundamentals certification, or similar certifications.
  • Working experience with Tableau, SAS Viya, R, and/or Python.
  • Experience with transition-in efforts to take over a large-scale Azure-based data lake platform.
  • Experience with agile development processes.