Standard Chartered · posted about 1 month ago
Full-time - Mid Level
Bangalore, IN
Credit Intermediation and Related Activities

About the position

The individual will be part of the AAIOps team, responsible for shaping a state-of-the-art ecosystem (in terms of tools, technology, and datamart) for data, analytics, and AI roles within SCMAC while aligning with the wider SCB tech simplification effort. The individual will play a crucial part in operationalizing AI/ML models and other analytical pipelines at scale, ensuring efficient and reliable deployment across diverse environments.

The role is responsible for the creation, maintenance, and successful execution of a machine learning operations and delivery (MLOps) ecosystem for the continuous delivery of AI artifacts (models, analyses, and predictions) developed by the SCMAC BA CoE to WRB markets and WRB digital solutions. It involves collaboration with both data scientists (SCMAC BA CoE) and software engineers (Enterprise Technology & WRB CIO Data Engineering teams) on the delivery of a fit-for-purpose big data ecosystem for building AI solutions in the SCMAC BA CoE, transitioning from proprietary technology (SAS) to open-source technologies. The individual will support the continuous improvement of the tools, technology, and data ecosystem available to all data roles in the SCMAC BA CoE; implement best practices for version control, testing, and CI/CD in ML pipelines; and stay current with emerging technologies and trends in MLOps, advanced AI (Gen AI), and cloud-based data and analytics platforms (Databricks, Dataiku).

The role is also responsible for developing robust data workflows and pipelines using Python and PySpark to process and analyze large-scale datasets, adopting new data feeds and feature stores, and in particular incorporating the WRB unified data model (Athena) to reduce latency and improve the quality of data artifacts.
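As an illustration of the kind of Python/PySpark workflow described above, here is a minimal batch-pipeline sketch that reads a large dataset, derives per-customer features of the sort a feature store might hold, and writes columnar output for downstream modelling. The dataset name, column names, and paths are hypothetical placeholders, not details from the posting.

  # Minimal PySpark batch-pipeline sketch. Dataset, columns, and paths
  # ("transactions", "amount", etc.) are illustrative placeholders only.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("feature-pipeline").getOrCreate()

  # Read a large source dataset (Parquet keeps I/O columnar and fast).
  txns = spark.read.parquet("/data/raw/transactions")

  # Derive simple per-customer features; domain logic for a feature
  # store would live in transformations like this.
  features = (
      txns
      .filter(F.col("amount") > 0)
      .groupBy("customer_id")
      .agg(
          F.count("*").alias("txn_count"),
          F.sum("amount").alias("total_spend"),
          F.avg("amount").alias("avg_spend"),
      )
  )

  # Persist output for downstream model training or serving.
  features.write.mode("overwrite").parquet("/data/features/customer_spend")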

Responsibilities

  • Shape a state-of-the-art ecosystem for data, analytics & AI roles within SCMAC.
  • Operationalize AI/ML models or other analytical pipelines at scale.
  • Create, maintain, and execute a machine learning operations and delivery (MLOps) ecosystem.
  • Collaborate with data scientists and software engineers on big data ecosystem delivery.
  • Support continuous improvement of tools, technology, and data ecosystem.
  • Implement best practices for version control, testing, and CI/CD in ML pipelines (see the testing sketch after this list).
  • Stay updated on emerging technologies and trends in MLOps and cloud-based platforms.
  • Develop robust data workflows and pipelines using Python and PySpark.
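To make the testing and CI/CD responsibility concrete, below is a minimal sketch of a unit test for a PySpark transformation, of the kind a CI pipeline (for example, in Azure DevOps) might run on every commit. The transformation and test names are hypothetical, not taken from the posting.

  # Hypothetical unit test for a PySpark feature transformation,
  # runnable under pytest as a CI step.
  import pytest
  from pyspark.sql import SparkSession, functions as F

  @pytest.fixture(scope="module")
  def spark():
      # Local single-threaded session keeps CI runs cheap and deterministic.
      return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

  def add_spend_features(df):
      # Transformation under test: per-customer spend aggregates.
      return df.groupBy("customer_id").agg(F.sum("amount").alias("total_spend"))

  def test_total_spend_sums_per_customer(spark):
      df = spark.createDataFrame(
          [("c1", 10.0), ("c1", 5.0), ("c2", 2.0)],
          ["customer_id", "amount"],
      )
      result = {r["customer_id"]: r["total_spend"]
                for r in add_spend_features(df).collect()}
      assert result == {"c1": 15.0, "c2": 2.0}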

Requirements

  • Proficiency in Python and PySpark for data processing and engineering tasks.
  • Knowledge of Azure DevOps (ADO).
  • Understanding of machine learning concepts and model lifecycle management (see the lifecycle sketch after this list).
  • Familiarity with the big data ecosystem and cloud computing.
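As one common open-source example of model lifecycle management, the sketch below trains a model and records its parameters, metrics, and artifact with MLflow so it can be versioned and promoted through environments. Note that MLflow is an assumption for illustration; the posting does not name a specific tracking tool.

  # Hypothetical model-lifecycle sketch using MLflow (not named in the
  # posting) to version a trained model alongside its metrics.
  import mlflow
  import mlflow.sklearn
  from sklearn.datasets import make_classification
  from sklearn.linear_model import LogisticRegression
  from sklearn.metrics import accuracy_score
  from sklearn.model_selection import train_test_split

  X, y = make_classification(n_samples=1000, random_state=42)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

  with mlflow.start_run():
      model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
      acc = accuracy_score(y_test, model.predict(X_test))
      mlflow.log_param("max_iter", 1000)        # track hyperparameters
      mlflow.log_metric("accuracy", acc)        # track evaluation metrics
      mlflow.sklearn.log_model(model, "model")  # version the model artifact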

Nice-to-haves

  • Experience with Databricks and Dataiku.
  • Knowledge of advanced AI (Gen AI) technologies.

Benefits

  • Core bank funding for retirement savings, medical and life insurance.
  • Time-off including annual leave, parental/maternity leave, sabbatical, and volunteering leave.
  • Flexible working options based around home and office locations.
  • Proactive wellbeing support through Unmind and Employee Assistance Programme.
  • Continuous learning culture with opportunities to reskill and upskill.