Shorepoint - Herndon, VA


Full-time - Mid Level
Herndon, VA
Professional, Scientific, and Technical Services

About the position

The Product Engineering Specialist - Kafka is responsible for designing, implementing, and optimizing Kafka-based data streaming architectures specifically for cybersecurity data collection and processing. This role involves maintaining Kafka clusters, ensuring high availability and fault tolerance, and collaborating with integration engineers to create efficient data pipelines. The position offers a unique opportunity to contribute to a fast-growing cybersecurity firm while supporting federal clients and engaging in agile methodologies.

Responsibilities

  • Design, implement, and optimize Kafka-based data streaming architectures for cybersecurity data collection and processing.
  • Develop and maintain Kafka clusters, ensuring high availability, fault tolerance, and scalability.
  • Configure and tune Kafka for optimal performance, considering factors such as partitioning, replication, and consumer group strategies.
  • Implement data replication strategies between edge Kafka deployments and centralized Kafka clusters.
  • Collaborate with integration engineers to design and implement efficient data pipelines from the data source through Kafka to the Elastic Stack.
  • Engage in all agile ceremonies, including backlog grooming, demos, and retrospectives.
  • Provide expertise and guidance on Kafka security features, including encryption, authentication, and authorization.
  • Conduct capacity planning and performance testing for Kafka deployments.
  • Troubleshoot complex issues in Kafka systems.
  • Develop and maintain documentation for Kafka configurations, best practices, and troubleshooting procedures.
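To illustrate the partitioning concerns named above, here is a minimal, dependency-free sketch of key-based partition assignment. Note that Kafka's default partitioner actually uses murmur2 hashing; CRC32 stands in here purely to keep the example self-contained. The function name and key values are illustrative, not part of any Kafka API.

```python
import zlib


def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition index.

    Simplified stand-in: real Kafka brokers use murmur2 hashing in the
    default partitioner; CRC32 is used here only for the sketch.
    """
    return zlib.crc32(key) % num_partitions


# Records sharing a key always land on the same partition,
# which is what preserves per-key ordering in Kafka.
assert assign_partition(b"sensor-42", 6) == assign_partition(b"sensor-42", 6)
```

This is why choosing a good record key matters for both ordering guarantees and balanced load across partitions.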

Requirements

  • Strong experience with Kafka and other big-data, distributed, and data-streaming technologies.
  • In-depth knowledge of the Kafka ecosystem and its core functionality.
  • Proficiency in Java or Python for developing Kafka-related applications and tools.
  • Ability to install, maintain, and troubleshoot Kafka clusters.
  • Understanding of data serialization formats (e.g., Avro, Protobuf) and schema management.
  • Ability to design secure configurations and access controls for shared Kafka deployments.
  • Excellent troubleshooting skills.
  • Excellent communication and interpersonal skills.
  • 5 years of relevant experience.
  • Ability to design, build, and maintain messaging configurations and flows in high-throughput, low-latency scenarios.
  • Strong problem-solving skills and ability to provide issue analysis on Kafka applications and other complex distributed systems.
  • Experience documenting tests and presenting findings.
  • Ability to obtain agency required clearance.

Nice-to-haves

  • BS in Computer Science, Information Systems, Mathematics, Engineering, or a related field preferred.
  • Industry-related certifications preferred.
  • Familiarity with containerization and orchestration technologies (Docker, Kubernetes).
  • Experience deploying Kafka in cloud-based environments (AWS, Azure, GCP).
  • Experience with Infrastructure as Code tools for deploying and managing Kafka clusters.
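For candidates less familiar with the containerization tooling above, a single-node Kafka broker for local experimentation can be brought up with a minimal Docker Compose file. This is a sketch assuming the defaults of the official `apache/kafka` image (which runs in KRaft mode, with no ZooKeeper); the image tag and service name are illustrative.

```yaml
# docker-compose.yml — minimal single-node Kafka for local testing only.
# Assumes the apache/kafka image's single-node defaults; not for production.
services:
  kafka:
    image: apache/kafka:latest
    ports:
      - "9092:9092"
```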

Benefits

  • 401(k)
  • Health insurance
  • 18 days of paid time off (PTO)
  • 11 holidays
  • 80% of insurance premium covered
  • Continued education, certification maintenance, and reimbursement