Société Générale - Chicago, IL

posted 2 months ago

Full-time - Senior
Hybrid - Chicago, IL
Credit Intermediation and Related Activities

About the position

The Senior Big Data Java Engineer will be responsible for developing and maintaining a data lake streaming platform in Azure, focusing on creating scalable and resilient systems. This role involves working closely with the Global Banking Technology & Operations team to deliver innovative IT solutions for the Equity Prime Services business line, ensuring high-quality code and effective collaboration within the team.

Responsibilities

  • Architect, design, and develop Kafka Streams-based Java applications in Azure
  • Architect, design, and develop data pipelines for Big Data volumes using Spark (Java) in Azure
  • Write high-quality code in Java
  • Design, develop and deploy systems with scalability and resiliency in mind
  • Review code, suggest improvements to design and process, and help the team improve
  • Work with distributed systems handling large volumes of data
  • Troubleshoot performance issues
  • Work with the Product Owner to break customer requests down into detailed stories
  • Deliver working code that meets acceptance criteria and the definition of done
  • Write code, deploy scripts, unit test, check code into the source code repository, and monitor delivery pipeline activity
  • Conduct testing, deployment, and production activities to ensure production stability
  • Engage in pair programming to write high-quality code
  • Write unit tests with JUnit and Mockito, and BDD-style tests with Cucumber
  • Attend backlog refinement and planning sessions to discuss and estimate upcoming stories

Requirements

  • 7+ years of experience as a Senior Java Programmer
  • 3+ years of experience with Spark, Kafka, and cloud platforms
  • Experience with Java, Kafka Streams and Spark
  • Good working knowledge of distributed systems
  • Comfortable with system design for Big Data systems, both batch and real-time
  • Working experience running Spark jobs and troubleshooting performance issues
  • Experience handling high volumes of data in both batch and real-time processing
  • Cloud experience in AWS or Azure
  • Sound knowledge of Spring Boot or another Java back-end framework, plus Kafka, Elasticsearch, Kibana, and Kubernetes
  • Strong experience with cloud and Big Data technologies such as Spark and Kafka
  • Experience designing RESTful APIs and integrating third-party RESTful APIs
  • Familiarity with version control and branching, ideally Git
  • Comfortable working in agile methodologies, ideally Scrum
  • Experience with automated testing approaches - test driven development, unit testing, integration testing, and BDD testing
  • Exposure to continuous integration tools
  • Understanding of service-oriented architectures and message brokers
  • Strong analytical skills and problem-solving ability

Benefits

  • Hybrid work arrangement allowing flexibility to work remotely and on-site
  • Diversity and Inclusion initiatives
  • Commitment to employee development and advancement