Specialist - Data Engineering

$132,663 - $160,800/Yr

Charles Schwab - Westlake, TX

posted 2 months ago

Full-time
Remote - Westlake, TX
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

About the position

The Data Engineering Specialist at Schwab is responsible for developing and implementing analytics applications that transform raw data into meaningful information. This role involves designing and developing data ingestion workflows across various data sources and patterns, leading small projects, and collaborating with cross-functional teams to ensure efficient implementation of data requirements. The position is 100% remote and emphasizes quality assurance and adherence to coding standards.

Responsibilities

  • Develop and implement techniques or analytics applications to transform raw data into meaningful information.
  • Design, develop, and implement new data ingestion workflows using existing and new data engineering techniques.
  • Lead small projects and perform enhancements for successful delivery.
  • Work with business analysts to understand business needs, new data requirements, and use cases.
  • Craft and update ETL specifications and supporting documentation.
  • Collaborate with technical directors, data modelers, and cross-functional teams to ensure accurate implementation of requirements.
  • Define and execute quality assurance and test scripts.
  • Review ETL deliverables from third-party vendor teams and advocate for agile practices to increase delivery efficiency.
  • Ensure consistency with published development, coding, and testing standards.
  • Train NERDs on functionality, tools, and technologies.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 36 months of experience in a related occupation.
  • Experience in developing and implementing data ingestion workflows using Informatica, Informatica Intelligent Cloud Services (IICS), and Talend.
  • Experience in developing and implementing code and procedures for data ingestion, and in performing data analysis, performance tuning, and debugging, using Teradata, GCP, BigQuery, and Google Cloud Storage (a minimal ingestion sketch follows this list).
  • Experience in developing Unix shell scripts for automating tasks related to data ingestion, analysis, and debugging (a Python rendering of such an automation task follows this list).
  • Experience in developing code and scripts to transform raw data into meaningful information using Python, Java, and Scala (see the transform sketch after this list).
  • Experience in performing CI/CD using Bamboo and Bitbucket tools.
  • Experience in batch monitoring with Control-M to ensure data availability in the data warehouse (DWH).
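
To make the GCP bullet above concrete, here is a minimal sketch of a batch load from Google Cloud Storage into BigQuery using the google-cloud-bigquery Python client. This illustrates the kind of ingestion work the role describes rather than any actual Schwab pipeline; the project, dataset, table, and bucket names are hypothetical placeholders, and a production workflow would add schema management, auditing, and error handling.

```python
# Minimal sketch: batch-load a CSV file from Google Cloud Storage into BigQuery.
# Project, dataset, table, and bucket names below are hypothetical placeholders.
from google.cloud import bigquery

def load_csv_to_bigquery(gcs_uri: str, table_id: str) -> int:
    """Load a CSV from GCS into a BigQuery table; return the table's row count."""
    client = bigquery.Client()  # uses application-default credentials
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,          # skip the header row
        autodetect=True,              # infer the schema from the data
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes (raises on failure)
    return client.get_table(table_id).num_rows

if __name__ == "__main__":
    rows = load_csv_to_bigquery(
        "gs://example-landing-bucket/trades/2024-01-01.csv",  # hypothetical path
        "example-project.example_dataset.trades",             # hypothetical table
    )
    print(f"Table now holds {rows} rows")
```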
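The shell-scripting bullet describes automating ingestion housekeeping. Since Python is among the posting's listed languages, the same kind of task is sketched here in Python rather than shell: sweep a landing directory for arrived files, move them into a processing area, and log each step. All paths are hypothetical.

```python
# Minimal sketch of the ingestion-automation task the posting describes for
# shell scripts, written in Python. All paths below are hypothetical.
import logging
import shutil
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("ingest-sweep")

LANDING = Path("/data/landing")        # where source feeds drop files
PROCESSING = Path("/data/processing")  # picked up by the ETL tool

def sweep_landing_dir() -> int:
    """Move every arrived .csv file from landing to processing; return the count."""
    moved = 0
    for csv_file in sorted(LANDING.glob("*.csv")):
        target = PROCESSING / csv_file.name
        shutil.move(str(csv_file), str(target))
        log.info("moved %s -> %s", csv_file, target)
        moved += 1
    if moved == 0:
        log.info("no files waiting in %s", LANDING)
    return moved

if __name__ == "__main__":
    sweep_landing_dir()
```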
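Finally, a sketch of the raw-to-meaningful transform the Python/Java/Scala bullet describes: aggregating raw trade records into per-symbol notional totals. The input file and its column names are assumptions for illustration only.

```python
# Minimal sketch of a raw-to-meaningful transform in Python: aggregate raw
# trade records into per-symbol notional totals. Column names are hypothetical.
import csv
from collections import defaultdict
from decimal import Decimal

def summarize_trades(raw_csv_path: str) -> dict[str, Decimal]:
    """Sum notional value (quantity * price) per symbol from a raw trade feed."""
    totals: dict[str, Decimal] = defaultdict(Decimal)
    with open(raw_csv_path, newline="") as fh:
        for row in csv.DictReader(fh):  # expects symbol,quantity,price columns
            totals[row["symbol"]] += Decimal(row["quantity"]) * Decimal(row["price"])
    return dict(totals)

if __name__ == "__main__":
    for symbol, notional in sorted(summarize_trades("raw_trades.csv").items()):
        print(f"{symbol}: {notional}")
```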