ETL Java Developer (SME)

$144,300 - $260,850/Yr

Leidos - Herndon, VA

posted 3 months ago

Full-time
Herndon, VA
Professional, Scientific, and Technical Services

About the position

We have an IMMEDIATE NEED for an ETL Java Developer SME to provide Agile DevOps support to mission-critical systems. As an ETL Java Developer on this program, you will have the opportunity to build robust systems, software, and cloud environments and provide operations and maintenance for critical systems. The candidate will provide technical expertise and support in the design, development, implementation, and testing of customer tools and applications that Extract, Transform, and Load (ETL) data into an enterprise Data Lake. Working in a DevOps framework, you will participate in and/or direct major project deliverables through all phases of the software development lifecycle, including scope and work estimation, architecture and design, coding, and unit testing.

This role requires developing custom code and scripts to quickly extract, triage, and exploit data across domains and data stores. You will be responsible for designing and implementing a large-scale ingest system in a big data environment. This includes reading, analyzing, and digesting what the enterprise needs to accomplish with its data and designing the best possible ETL process to support those objectives. You will also be responsible for recommending methodologies to optimize the visualization, organization, storage, and availability of large-scale data in support of enterprise requirements. In addition, you will participate in software programming initiatives to support innovation and enhancement, using technologies such as HTML, CSS, JavaScript, Java, Python, Spring Boot, and Hibernate. You will develop and direct software system validation and testing methods using JUnit and Katalon, and you will develop, prototype, and deploy solutions within Commercial Cloud Solutions, leveraging infrastructure platform services.

Analysis through proof-of-concept, performance, and end-to-end testing will be essential to coordinate the infrastructure needs driven by developed software and meet customer mission needs. You will support the Agile software development lifecycle following Program SAFe practices and use industry-leading DevOps tools such as GitHub and Jenkins, along with Unix bash scripting. You will document and perform systems software development, including deploying build artifacts across different environments using GitFlow constructs, and you will leverage the Atlassian tool suite (e.g., JIRA and Confluence) to track activities and to apply and identify best practices and standard operating procedures. Close coordination with team members, Product Owners, and Scrum Masters will be necessary to ensure User Story alignment and implementation against customer use cases. Additionally, you will communicate key project data to team members, build team cohesion and effectiveness, and hold meetings with the PMO and enterprise stakeholders.
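To give candidates a concrete sense of the extract-transform-load work described above, here is a minimal, dependency-free Java sketch. It is purely illustrative: the record format, field names, and in-memory "load" target are hypothetical stand-ins, not the program's actual codebase or data stores.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative sketch of one extract-transform-load step over in-memory data.
// In a real pipeline, extract would read from source data stores and load
// would write to a Data Lake; here strings and a Map keep it self-contained.
public class EtlSketch {

    // Extract: split one raw CSV record (hypothetical "id,name,bytes" format).
    static String[] extract(String rawLine) {
        return rawLine.split(",");
    }

    // Transform: parse fields, normalize the name, convert bytes to kilobytes.
    static Map<String, Object> transform(String[] fields) {
        Map<String, Object> record = new HashMap<>();
        record.put("id", Integer.parseInt(fields[0].trim()));
        record.put("name", fields[1].trim().toLowerCase());
        record.put("kilobytes", Long.parseLong(fields[2].trim()) / 1024);
        return record;
    }

    // Load: index records by id, standing in for a write to the target store.
    static Map<Integer, Map<String, Object>> load(List<Map<String, Object>> records) {
        return records.stream()
                .collect(Collectors.toMap(r -> (Integer) r.get("id"), r -> r));
    }

    public static void main(String[] args) {
        List<String> raw = List.of("1, Alice ,2048", "2, BOB ,4096");
        List<Map<String, Object>> transformed = raw.stream()
                .map(EtlSketch::extract)
                .map(EtlSketch::transform)
                .collect(Collectors.toList());
        Map<Integer, Map<String, Object>> lake = load(transformed);
        System.out.println(lake.get(1).get("name"));      // alice
        System.out.println(lake.get(2).get("kilobytes")); // 4
    }
}
```

In production-scale ingest, the same extract/transform/load boundaries would typically be distributed across tools named in this posting, such as NiFi flows feeding Kafka topics.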

Responsibilities

  • Designing and implementing a large-scale ingest system in a big data environment.
  • Reading, analyzing, and digesting enterprise data needs to design the best possible ETL process.
  • Recommending methodologies to optimize the visualization, organization, storage, and availability of large-scale data.
  • Participating in software programming initiatives using HTML, CSS, JavaScript, Java, Python, Spring Boot, and Hibernate.
  • Developing and directing software system validation and testing methods using JUnit and Katalon.
  • Developing, prototyping, and deploying solutions within Commercial Cloud Solutions leveraging Infrastructure platform services.
  • Analyzing infrastructure needs driven by developed software to meet customer mission needs through proof of concept, performance, and end-to-end testing.
  • Supporting the Agile software development lifecycle following Program SAFe practices.
  • Using industry-leading DevOps tools such as GitHub and Jenkins, along with Unix bash scripting.
  • Documenting and performing systems software development, including deployment of build artifacts across different environments leveraging GitFlow constructs.
  • Leveraging the Atlassian tool suite (e.g., JIRA and Confluence) to track activities.
  • Applying and identifying best practices and standard operating procedures.
  • Coordinating closely with team members, Product Owners, and Scrum Masters to ensure User Story alignment and implementation against customer use cases.
  • Communicating key project data to team members and building team cohesion and effectiveness.
  • Holding meetings with PMO and enterprise stakeholders.
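The validation-and-testing responsibility above usually means unit tests around parsing and transform logic. The posting names JUnit; to keep this sketch self-contained it uses plain Java checks instead, but each check maps one-to-one onto a JUnit 5 assertion. The `parseSize` helper is hypothetical, standing in for real parser code.

```java
// Dependency-free sketch of the checks a JUnit test would perform on a small
// parser. In the actual program these would be @Test methods using JUnit 5's
// Assertions.assertEquals and Assertions.assertThrows.
public class ValidationSketch {

    // Unit under test (hypothetical): parse a size like "4k" into bytes.
    static long parseSize(String text) {
        String s = text.trim().toLowerCase();
        if (s.endsWith("k")) {
            return Long.parseLong(s.substring(0, s.length() - 1)) * 1024;
        }
        return Long.parseLong(s);
    }

    public static void main(String[] args) {
        // Happy path: a plain byte count and a "k" suffix with noise around it.
        check(parseSize("2048") == 2048L, "plain byte count");
        check(parseSize(" 4K ") == 4096L, "suffix with whitespace and case");

        // Failure path: malformed input must raise NumberFormatException.
        boolean threw = false;
        try {
            parseSize("four");
        } catch (NumberFormatException expected) {
            threw = true;
        }
        check(threw, "malformed input rejected");
        System.out.println("all checks passed");
    }

    static void check(boolean condition, String label) {
        if (!condition) throw new AssertionError("failed: " + label);
    }
}
```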

Requirements

  • Demonstrated experience performing ETL activities including parser development and deployment, data flow management, implementing data lifecycle policies, troubleshooting data access issues, and developing data models.
  • Experience with data modeling.
  • Extensive experience with relational databases, such as MySQL, queried via SQL.
  • Extensive experience using Java for data processing, manipulation, or querying (SQL or NoSQL).
  • ETL/Data Integration experience using Spring, NiFi, Kafka, and Elasticsearch.
  • Experience with development in Commercial Cloud Platforms (e.g., AWS, Google Cloud, Azure).
  • Experience with development leveraging cloud data services (e.g., S3, RDS, EFS).
  • Excellent communication skills (written and verbal).
  • Experience leading development scrum teams.
  • Candidate must have a Master's degree with 15+ years of prior relevant experience.
  • Candidate must have an active TS/SCI with polygraph.
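As a concrete illustration of the Java data-manipulation experience listed above, the stream below is an in-memory analogue of the SQL query `SELECT region, SUM(amount) FROM orders GROUP BY region`. The `Order` record and sample data are hypothetical, chosen only to show the pattern.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative Java data-manipulation sketch: grouping and aggregating
// records with the Streams API, equivalent to a SQL GROUP BY + SUM.
public class GroupBySketch {

    // Hypothetical row type standing in for a database record.
    record Order(String region, long amount) {}

    // Group orders by region and total their amounts per group.
    static Map<String, Long> totalByRegion(List<Order> orders) {
        return orders.stream()
                .collect(Collectors.groupingBy(Order::region,
                        Collectors.summingLong(Order::amount)));
    }

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("east", 100), new Order("west", 250),
                new Order("east", 50));
        Map<String, Long> totals = totalByRegion(orders);
        System.out.println(totals.get("east")); // 150
        System.out.println(totals.get("west")); // 250
    }
}
```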

Nice-to-haves

  • Demonstrated experience using Neo4J.