This job is closed

We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.

Capgemini Holding • posted 28 days ago
Full-time • Mid Level
Poznań, Poland
Professional, Scientific, and Technical Services

About the position

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Insights & Data is a dynamic team of over 400 professionals delivering cutting-edge data solutions. We specialize in Cloud & Big Data engineering, building scalable systems for complex datasets across AWS, Azure, and GCP. Our expertise spans the full Software Development Life Cycle (SDLC), using modern data processing tools, advanced programming techniques, and DevOps best practices to create impactful solutions. Come on board!

Responsibilities

  • Design and implement Azure-based data solutions for large-scale and unstructured datasets.
  • Develop and optimize data pipelines using Azure Data Factory, Databricks, or Snowflake (a minimal sketch of such a pipeline follows this list).
  • Collaborate with solution architects to define and implement best practices in data engineering.
  • Ensure data quality, scalability, and security across all Azure-based solutions.
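
For a sense of the pipeline work described above, here is a minimal PySpark sketch of a batch job of the kind one might build on Databricks. It is illustrative only: the storage account, container paths, and column names are hypothetical, not taken from this posting.

```python
# Minimal illustrative PySpark batch pipeline; all paths and column
# names below are hypothetical examples, not from the job posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw JSON events from a (hypothetical) ADLS Gen2 location.
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic data-quality gate: drop records missing required fields.
clean = raw.dropna(subset=["order_id", "amount", "order_ts"])

# Aggregate to a daily revenue table.
daily = (
    clean.withColumn("order_date", F.to_date("order_ts"))
         .groupBy("order_date")
         .agg(
             F.sum("amount").alias("revenue"),
             F.count("order_id").alias("orders"),
         )
)

# Write curated output as partitioned Parquet (Delta Lake would be
# the typical choice on Databricks).
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/orders_daily/"
)
```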

Requirements

  • At least 3 years of experience in data engineering, including 1+ year working with Azure.
  • Strong Python skills for data processing and automation.
  • Hands-on experience with one of the following: Databricks, Snowflake, or Microsoft Fabric.
  • Strong communication skills and a very good command of English.

Nice-to-haves

  • Strong SQL skills and experience with database optimization.
  • Knowledge of DevOps practices, CI/CD pipelines, and Infrastructure as Code (IaC) tools (Terraform, Bicep).
  • Familiarity with key Azure services: Data Lake, Event Hub, Data Factory, Synapse Analytics, Azure Functions.
  • Exposure to containerization (Docker, Kubernetes) and cloud security best practices.
  • Experience with real-time data processing and streaming technologies such as Kafka or Spark Streaming (see the streaming sketch after this list).
  • Hands-on experience with PySpark.
  • Certifications such as DP-900, DP-203, AZ-204, or AZ-400.
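
To illustrate the streaming experience mentioned above, the sketch below shows a minimal Spark Structured Streaming job that consumes from Kafka, applies a watermark for late data, and aggregates over event-time windows. The broker address, topic name, and event schema are hypothetical.

```python
# Minimal illustrative Spark Structured Streaming job; broker, topic,
# and schema are hypothetical. Requires the spark-sql-kafka connector
# package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

spark = SparkSession.builder.appName("events-stream").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Consume JSON messages from a (hypothetical) Kafka topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Tolerate up to 10 minutes of late data, then aggregate per user
# over 5-minute event-time windows.
agg = (
    events.withWatermark("event_ts", "10 minutes")
          .groupBy(F.window("event_ts", "5 minutes"), "user_id")
          .agg(F.sum("amount").alias("amount"))
)

# Console sink keeps the sketch self-contained; a real job would
# write to Delta, Event Hubs, or similar.
query = agg.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```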

Benefits

  • Permanent employment contract from the first day.
  • Hybrid, flexible working model.
  • Equipment package for home office.
  • Private medical care with Medicover.
  • Life insurance.
  • Capgemini Helpline.
  • NAIS benefit platform.
  • Access to 70+ training tracks with certification opportunities.
  • Platform with free access to Pluralsight, TED Talks, Coursera, Udemy Business and SAP Learning HUB.
  • Community Hub that will allow you to choose from over 20 professional communities.

Job Keywords

Hard Skills
  • Azure Data Lake
  • Azure Pipelines
  • Docker
  • Java
  • Kubernetes