Data Engineer, Finance Technology

$118,900 - $205,600/Yr

Amazon - Seattle, WA

posted 4 days ago

Full-time - Mid Level

About the position

The Data Engineer position at Amazon Finance Technology (FinTech) involves building data engineering solutions that process massive volumes of data using AWS technologies. The role focuses on creating a robust data lake, optimizing financial resource allocation, and enhancing claims processing capabilities. The Data Engineer will work closely with software engineers and data experts to enable data-driven decision-making for finance and risk management, ensuring timely and accurate insights.

Responsibilities

  • Design, implement, and support data lake infrastructure using the AWS big data stack: Python, Redshift, QuickSight, Glue/Lake Formation, EMR/Spark/Scala, and Athena.
  • Interface with customers and business stakeholders to gather requirements and deliver complete data solutions.
  • Model data and metadata to support ad-hoc and pre-built reporting.
  • Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
  • Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
  • Tune application and query performance using profiling tools and SQL.
  • Analyze and solve problems at their root, stepping back to understand the broader context.
  • Learn and understand a broad range of Amazon's data resources, and know when, how, and whether to use each.
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
  • Triage possible courses of action in a high-ambiguity environment, applying quantitative analysis and business judgment to design decisions.
  • Mentor other engineers, positively influence team culture, and help grow the team.

Requirements

  • 3+ years of data engineering experience
  • Experience with data modeling, warehousing, and building ETL pipelines
  • Experience with SQL
  • Knowledge of batch and streaming data architectures and technologies such as Kafka, Kinesis, Flink, Storm, and Beam
  • Bachelor's degree
  • Proficiency in at least one modern programming language such as Java, Scala, or Python

Nice-to-haves

  • Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
  • Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
  • Master's degree

Benefits

  • Flexible working hours
  • Work-life balance emphasis
  • Diverse and inclusive workplace
  • Total compensation package including equity and sign-on payments
  • Full range of medical, financial, and other benefits