Senior Data Streaming Engineer

$125,000 - $160,000/Yr

Hoopla - Holland, OH

posted about 2 months ago

Full-time - Senior
Remote - Holland, OH
Clothing, Clothing Accessories, Shoe, and Jewelry Retailers

About the position

As a Senior Data Streaming Engineer at Midwest Tape, you will play a crucial role in designing and building high-performance streaming solutions. This position involves collaborating with other engineers to create robust real-time data pipelines using technologies such as Apache Kafka and Flink. You will be responsible for ensuring the implementation of microservices principles and domain-driven design best practices while fostering a collaborative environment for knowledge sharing and continuous improvement.

Responsibilities

  • Lead the collaborative design and implementation of event-driven, real-time streaming architectures using Apache Kafka and Apache Flink.
  • Foster a collaborative environment with fellow streaming developers, promoting knowledge sharing, mentorship, and continuous refinement of best practices in stream processing, fault tolerance, and scalability.
  • Architect and implement production-grade, fault-tolerant Kafka pipelines and Flink applications using Java 17+, leveraging Flink's DataStream, Table, or SQL APIs to process high-volume, low-latency data streams.
  • Ensure compliance with company policies, data governance standards, and industry regulations in all aspects of streaming development and operations.
  • Advocate and enforce best practices for stream processing, code quality, testing strategies, and maintainability to build a resilient and future-proof streaming infrastructure.
  • Engage in solution architecture discussions, provide technical guidance, and conduct thorough code reviews to uphold high standards of software craftsmanship and system performance.
  • Drive continuous improvement by identifying and proposing enhancements to operational workflows, technical stack, and development methodologies, focusing on efficiency, scalability, and cost-effectiveness.
  • Collaborate with fellow engineers, DevOps, and operations teams to proactively monitor, troubleshoot, and optimize streaming applications.
  • Contribute to cross-functional initiatives, knowledge sharing sessions, and documentation efforts to elevate the team's expertise in Kafka, Flink, and event-driven architectures.

Requirements

  • Proven expertise in designing, implementing, and optimizing event-driven, real-time streaming architectures using Confluent Kafka and Apache Flink.
  • Advanced proficiency in Java 17+ and extensive experience with Flink's DataStream, Table, and SQL APIs for developing complex stream processing applications.
  • Deep understanding of microservices architecture, domain-driven design (DDD) principles, and event sourcing and CQRS patterns, with a track record of applying these concepts in stream processing systems.
  • Extensive experience with Kafka ecosystem, including Kafka Connect for scalable and fault-tolerant data ingestion/egress, and familiarity with common connectors for databases, message queues, and cloud services.
  • Demonstrated ability to provide technical leadership, mentor team members, and foster a collaborative environment that promotes knowledge sharing and continuous improvement.
  • Strong SQL skills, including the ability to write, optimize, and review complex queries, especially in the context of stream-table joins and windowing operations in Flink SQL.
  • Exceptional problem-solving, debugging, and performance tuning skills, with experience in root cause analysis of issues in distributed streaming systems.
  • Proficiency with the Spring Framework, particularly Spring Boot and Spring Cloud, for building and deploying microservices-based streaming applications.
  • Hands-on experience with in-memory data stores like Memcached and Redis for caching, state management, and enhancing the performance of streaming applications.
  • Solid understanding of AWS cloud for deploying, scaling, and managing streaming workloads.
  • Experience with DevOps practices, CI/CD pipelines, and containerization technologies (Docker, Kubernetes) to streamline the deployment and management of streaming applications.
  • Proficiency in Agile/Scrum methodologies, with experience in sprint planning, daily stand-ups, and iterative development in a data-driven environment.
  • Familiarity with collaboration tools such as JIRA and Confluence.
  • Excellent interpersonal, written, and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders.
  • Experience with data governance, schema evolution strategies (Avro and JSON), and ensuring data quality and consistency in streaming pipelines.
  • Knowledge of monitoring and observability tools (e.g., Datadog) for real-time insights into streaming application performance and data flows.
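To make the windowing requirement above concrete, here is a minimal, framework-free sketch of a tumbling-window aggregation over a keyed event stream, using only the JDK. It mirrors the semantics of a Flink tumbling event-time window (events are bucketed by flooring their timestamp to the window size, then summed per key); the `Event` record and field names are illustrative assumptions, not part of any real pipeline here, and a production job would use Flink's DataStream or SQL APIs rather than this sketch.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative sketch (plain JDK, no Flink): tumbling-window aggregation
// over a keyed event stream, analogous to Flink tumbling-window semantics.
public class TumblingWindowSketch {
    record Event(String key, long timestampMs, long value) {}

    // Assign each event to a window by flooring its timestamp to the window
    // size, then sum values per (window start, key).
    static Map<Long, Map<String, Long>> aggregate(List<Event> events, long windowMs) {
        return events.stream().collect(Collectors.groupingBy(
                e -> (e.timestampMs() / windowMs) * windowMs,   // window start
                TreeMap::new,                                   // windows in time order
                Collectors.groupingBy(Event::key,
                        Collectors.summingLong(Event::value))));
    }

    public static void main(String[] args) {
        List<Event> events = List.of(
                new Event("sku-1", 1_000, 2),
                new Event("sku-1", 4_000, 3),
                new Event("sku-2", 2_000, 5),
                new Event("sku-1", 6_000, 1));   // lands in the second 5s window
        aggregate(events, 5_000).forEach((start, sums) ->
                System.out.println("window@" + start + " -> " + sums));
    }
}
```

In Flink SQL the same computation would be expressed with a `TUMBLE` window in the `GROUP BY` clause; the sketch only shows the bucketing-and-aggregating idea the requirement refers to.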

Nice-to-haves

  • B.S./M.S. in software engineering, computer science, or a related area, or equivalent experience.
  • A relevant certification (e.g., Confluent Certified Developer for Apache Kafka®), demonstrating commitment to professional development in this field.
  • 8+ years developing scalable, fault-tolerant full-stack or backend systems in Java, with a focus on event-driven and real-time applications.
  • Deep expertise with Flink's SQL API for stream processing, including experience with complex event processing, temporal tables, and user-defined functions (UDFs) to enrich streaming data.
  • Advanced knowledge of Flink state management using RocksDB, Apache Ignite, and optimal checkpointing strategies.
  • Experience with schema evolution strategies (Avro, JSON) in Kafka and Flink to ensure data compatibility across pipeline upgrades.
  • Experience with Docker for containerization and Kubernetes (preferably with Helm) for orchestrating streaming deployments.
  • Strong DevOps skills including Git, CI/CD pipelines, and Infrastructure as Code (Terraform).
  • Knowledge of stream processing performance tuning and benchmarking.
  • Knowledge of data modeling techniques for streaming data, including temporal modeling and SCDs.
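The schema-evolution bullets above can be illustrated with a small, framework-free sketch: a consumer tolerates a newly added optional field by supplying a default, which is the effect an Avro field default (registered with a schema registry) gives you in a real Kafka/Flink pipeline. The record and field names (`OrderV2`, `currency`) are made up for illustration.

```java
import java.util.Map;

// Sketch of backward-compatible schema evolution: a v2 producer adds an
// optional "currency" field; the reader defaults it when the field is
// absent, so records written under the old schema stay readable --
// mirroring what an Avro default value provides in a real pipeline.
public class SchemaEvolutionSketch {
    record OrderV2(long orderId, long amountCents, String currency) {}

    static OrderV2 read(Map<String, Object> record) {
        return new OrderV2(
                (Long) record.get("orderId"),
                (Long) record.get("amountCents"),
                // Default for the new field keeps old records readable.
                (String) record.getOrDefault("currency", "USD"));
    }

    public static void main(String[] args) {
        // Old record, written before "currency" existed:
        System.out.println(read(Map.<String, Object>of(
                "orderId", 1L, "amountCents", 250L)));
        // New record carrying the added field:
        System.out.println(read(Map.<String, Object>of(
                "orderId", 2L, "amountCents", 300L, "currency", "EUR")));
    }
}
```

The same forward/backward-compatibility reasoning applies whether the wire format is Avro or JSON; only the enforcement mechanism (schema registry compatibility checks vs. application-level defaults) differs.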

Benefits

  • Medical, dental, & vision insurance
  • 401k + match
  • Profit sharing
  • Paid vacation and personal time
  • Flex time
  • 10 paid holidays
  • Company performance bonus
  • Holiday bonus
  • Paid time to volunteer
  • Training & career development opportunities