The position requires core technical skills in Big Data technologies such as HDFS, Hive, Spark, HDP/CDP, ETL pipelines, SQL, Ranger, Python, and Databricks. Familiarity with cloud services, preferably both AWS and Azure, is essential, including S3/ADLS, Delta Lake, Azure Key Vault, HashiCorp, and Splunk. The candidate should have knowledge of data quality and governance practices, along with hands-on experience in tools such as Dataiku or Dremio. The role also involves leading projects, providing timely status reporting, and managing smooth release processes to deliver projects on schedule.
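To give a concrete sense of the day-to-day ETL work the posting describes, below is a minimal PySpark sketch of an extract-transform-load job that reads raw files from object storage and writes a Delta Lake table. It assumes a Databricks-style environment where Delta Lake is preconfigured; the bucket paths, column names, and partitioning scheme are all hypothetical placeholders, not requirements of the role.

```python
# Minimal ETL sketch, assuming Delta Lake is available (e.g., a Databricks
# runtime). All paths and column names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files landed in object storage (S3 here; an ADLS
# path such as abfss://... would work the same way).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("s3://example-landing-zone/orders/"))

# Transform: basic data-quality rules — drop null keys, parse timestamps,
# deduplicate on the business key, derive a partition column.
clean = (raw
         .filter(F.col("order_id").isNotNull())
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("order_date", F.to_date("order_ts"))
         .dropDuplicates(["order_id"]))

# Load: write a partitioned Delta table to the curated zone.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("s3://example-curated-zone/orders_delta/"))
```

In practice, a job like this would typically be scheduled and monitored as part of a governed pipeline, which is where the data quality, governance, and release-management responsibilities mentioned above come in.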