American Express - Phoenix, AZ

posted 2 months ago

Full-time - Mid-Level
Phoenix, AZ
Credit Intermediation and Related Activities

About the position

As a Data Engineer at American Express, you will be part of a dynamic and diverse tech team that is dedicated to architecting, coding, and shipping software that enhances our customers' digital experiences. This role is integral to the U.S. Consumer Services and Enterprise Digital & Data Technology Team, which focuses on foundational strategic technology capabilities including digital experience engineering, AI/ML, and marketing technology. You will work with the latest technologies in a supportive and inclusive environment where your contributions are recognized and valued.

The primary focus of this role is to design, develop, and optimize data architectures on Google Cloud Platform (GCP). You will be responsible for building, testing, and deploying data pipelines that move, transform, and process data from various sources to GCP. This includes utilizing GCP's big data technologies such as BigQuery, Dataflow, Dataprep, and Pub/Sub to implement effective data processing solutions. You will also monitor system performance, troubleshoot issues, and ensure the reliability and scalability of data pipelines.

In this fast-paced Agile environment, you will collaborate with cross-functional teams to understand data requirements and develop solutions that meet business needs. Your role will also involve creating and maintaining comprehensive documentation for tools, architecture, processes, and solutions, ensuring that all aspects of the data engineering process are well-documented and accessible to your team and stakeholders.

Responsibilities

  • Implement scalable and efficient data architectures on GCP
  • Collaborate with cross-functional teams to understand data requirements and develop solutions that meet business needs
  • Build, test, and deploy data pipelines to move, transform, and process data from various sources to GCP
  • Ensure the reliability, scalability, and performance of data pipelines
  • Utilize GCP's big data technologies such as BigQuery, Dataflow, Dataprep, and Pub/Sub to implement effective data processing solutions
  • Monitor system performance and proactively optimize data pipelines for efficiency
  • Troubleshoot and resolve data pipeline and system issues
  • Create and maintain comprehensive documentation for tools, architecture, processes, and solutions

Requirements

  • 5-7 years of experience in Java/Python development
  • Solid understanding of GCP services including Cloud Dataflow, Cloud Pub/Sub, BigQuery, Cloud Storage, and Cloud Composer
  • Strong SQL knowledge
  • Understanding of fundamentals of Git and Git workflows
  • Experience working in an agile application development environment
  • Ability to provide technical support for applications and troubleshoot software and application-level issues
  • Ability to write and test programs using Unix Shell scripting and Oracle PL/SQL programming

Benefits

  • Competitive base salaries
  • Bonus incentives
  • 6% Company Match on retirement savings plan
  • Free financial coaching and financial well-being support
  • Comprehensive medical, dental, vision, life insurance, and disability benefits
  • Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
  • 20+ weeks paid parental leave for all parents, regardless of gender, offered for pregnancy, adoption or surrogacy
  • Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
  • Free and confidential counseling support through our Healthy Minds program
  • Career development and training opportunities