Social Finance - San Francisco, CA

posted about 1 month ago

Full-time - Senior
Remote - San Francisco, CA
Religious, Grantmaking, Civic, Professional, and Similar Organizations

About the position

The Data Manager for Risk Engineering & AI at SoFi is a senior-level role focused on developing data-intensive solutions to enhance data-driven decision-making within the Independent Risk Management function. This position involves collaborating with senior leadership to identify opportunities for innovative data and AI solutions, ultimately shaping risk oversight strategies. The role requires hands-on experience in data engineering, API development, and data governance, with a strong emphasis on building scalable data pipelines and ensuring data quality.

Responsibilities

  • Design, develop, and maintain scalable and efficient data pipelines to ingest, process, and store diverse data sets from multiple sources.
  • Perform data cleaning, validation, and transformation to prepare high-quality data sets for analytics, reporting, and machine learning.
  • Build and manage data staging environments and ensure optimal storage of structured and unstructured data for downstream applications.
  • Create and maintain APIs to expose data for internal teams, ensuring secure and reliable data access.
  • Collaborate with AI/ML specialists and risk and financial reporting analysts to deploy enterprise-grade data processing pipelines to production.
  • Implement best practices for data governance, including data lineage, versioning, and monitoring to maintain data accuracy and reliability.
  • Identify areas for process improvement, automation, and optimization within the data pipeline and implement innovative solutions.

Requirements

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 8+ years of experience in data engineering or a related role, building analytics solutions that leverage Git workflows, modular coding practices, API integrations, and web-based tools.
  • Expert/advanced proficiency in Python for data manipulation and transformation (e.g., pandas, NumPy).
  • Strong experience with advanced SQL for querying complex database schemas, data manipulation, and database management.
  • Deep hands-on experience with core technologies: Snowflake, Airflow, dbt, Git, Docker, Tableau, Streamlit, and SQL.
  • Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB) and data lakes for handling large amounts of text data.

Nice-to-haves

  • Prior experience working with machine learning teams and deploying AI-based solutions is preferred.

Benefits

  • Comprehensive and competitive benefits package including health insurance, retirement plans, and paid time off.