Deutsche Bank - Cary, NC

posted 5 days ago

Full-time - Mid Level
Remote - Cary, NC
Credit Intermediation and Related Activities

About the position

The Cloud Data Platform Engineer at Deutsche Bank is responsible for designing, implementing, and maintaining the infrastructure and tools necessary for a cloud-based data platform. This role involves collaboration with data engineers, analysts, and cross-functional teams to ensure the platform meets organizational needs and supports data-driven initiatives. The position emphasizes a hands-on engineering culture and offers opportunities for professional development within a flexible working environment.

Responsibilities

  • Design, implement, and maintain scalable and reliable data infrastructure solutions on Google Cloud Platform.
  • Develop and manage data pipelines to ingest, process, and store data from various sources into the cloud data platform.
  • Optimize data storage and retrieval processes for performance, scalability, and cost-effectiveness.
  • Collaborate with data engineers, data scientists, and analysts to understand their requirements and provide effective solutions.
  • Implement and maintain data governance and security measures to ensure data confidentiality, integrity, and availability.
  • Monitor and troubleshoot data platform issues, maintaining overall observability to ensure solution resiliency and high availability.
  • Stay updated with the latest cloud technologies and recommend improvements to data platform architecture and processes.
  • Gather requirements from stakeholders, prioritize tasks, and deliver projects on time and within budget.

Requirements

  • Moderate experience in data engineering, cloud engineering, or a related field.
  • Experience with big data technologies such as Hadoop, Spark, and Kafka.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong experience with Google Cloud Platform.
  • Familiarity with database systems such as Hive, Impala, PostgreSQL, or NoSQL databases.
  • Hands-on experience with data pipeline orchestration tools such as Apache Airflow or Luigi.
  • Knowledge of data governance and security best practices.

Nice-to-haves

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills.
  • Bachelor's degree or equivalent in Computer Science, Engineering, or a related field.

Benefits

  • Diverse and inclusive environment that embraces change, innovation, and collaboration.
  • Hybrid working model with up to 60% work from home.
  • Generous vacation, personal and volunteer days.
  • Commitment to Corporate Social Responsibility.
  • Access to Employee Resource Groups for community engagement.
  • Competitive compensation packages including health and wellbeing benefits, retirement savings plans, parental leave, and family building benefits.
© 2024 Teal Labs, Inc