Fluxtek Solutions - East Pittsburgh, PA

posted 3 months ago

Full-time - Senior
East Pittsburgh, PA
Professional, Scientific, and Technical Services

About the position

As a Data Architect at Fluxtek Solutions, you will play a pivotal role in designing and developing scalable, high-performance systems using Core Java and Apache Kafka. Your primary responsibility will be to integrate Apache Kafka with a variety of systems and applications, enabling real-time data streaming and supporting an event-driven architecture. The position requires a deep understanding of data modeling and the ability to analyze and optimize the performance of Kafka clusters and related components, particularly for Distributed Stream Processing (DSP) with Kafka Streams. You will draw on your extensive software engineering experience, especially with Kafka, to build robust data solutions that meet the needs of our clients.

Your role will involve collaborating with cross-functional teams to ensure that the systems you design are efficient and aligned with the strategic goals of the organization. You will work in an Agile environment, which requires adaptability and a proactive approach to problem-solving.

In addition to technical skills, strong communication and interpersonal skills are essential, as you will interact with stakeholders at all levels. You will need to convey complex technical concepts to non-technical team members and clients, keeping everyone aligned on project objectives and timelines. Your ability to manage multiple priorities and meet project deadlines will be crucial to your success in this position.

Responsibilities

  • Design and develop scalable and high-performance systems using Core Java with Confluent or Apache Kafka.
  • Integrate Apache Kafka with various systems and applications to enable real-time data streaming and event-driven architecture.
  • Analyze and optimize the performance of Kafka clusters and related components, particularly for Distributed Stream Processing (DSP) with Kafka Streams (see the illustrative sketch below).
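
For illustration, a minimal Kafka Streams topology in Core Java of the kind described above might look like the sketch below; the application ID, broker address, and topic names ("events-in", "events-out") are hypothetical placeholders, not details from this posting.

  // Minimal Kafka Streams sketch: read events, drop empty payloads, forward downstream.
  // All identifiers below (application id, broker, topics) are illustrative assumptions.
  import java.util.Properties;

  import org.apache.kafka.common.serialization.Serdes;
  import org.apache.kafka.streams.KafkaStreams;
  import org.apache.kafka.streams.StreamsBuilder;
  import org.apache.kafka.streams.StreamsConfig;
  import org.apache.kafka.streams.kstream.KStream;

  public class EventStreamSketch {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-stream-sketch");   // hypothetical app id
          props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");     // assumed local broker
          props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
          props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

          StreamsBuilder builder = new StreamsBuilder();
          // Read raw events, filter out null or empty payloads, and forward them downstream.
          KStream<String, String> events = builder.stream("events-in");            // hypothetical source topic
          events.filter((key, value) -> value != null && !value.isEmpty())
                .to("events-out");                                                 // hypothetical sink topic

          KafkaStreams streams = new KafkaStreams(builder.build(), props);
          Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
          streams.start();
      }
  }

In practice, the topology, serdes, and cluster configuration would be tuned to each client's data model, throughput, and latency requirements.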

Requirements

  • 10+ years of experience in software engineering
  • Kafka development experience
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Experience working with Agile methodologies and in Agile environments.
  • Excellent communication and interpersonal skills to effectively collaborate with stakeholders at all levels.
  • Strong problem-solving and analytical skills to address complex technical challenges.
  • Ability to work independently, manage multiple priorities, and meet project deadlines.

Nice-to-haves

  • Data modeling experience (1 year preferred)

Benefits

  • 401(k)
  • Dental insurance
  • Health insurance