Amazon.com • Posted about 2 months ago
$118,900 - $205,600/Yr
Full-time • Mid Level
Seattle, WA
General Merchandise Retailers

About the position

Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. AWS offers over 100 fully featured services to millions of active customers around the world, including the fastest-growing startups, largest enterprises, and leading government agencies, to power their infrastructure. Business Product & Operations (BPO) is a diverse team that supports infrastructure and other foundational initiatives spanning the Sales, Marketing, and Global Services Operations teams within AWS.

We are looking for a hands-on Data Engineer with experience developing and delivering data platforms as we build the next iteration of our data-driven ecosystem, with a focus on enhancing and expanding our Phoenix data product. Come join a team at the forefront of transforming how AWS does business with its key customers.

As a Data Engineer at AWS, you will partner with cross-functional teams, including Business Intelligence Engineers, Analysts, Software Developers, and Product Managers, to develop scalable and maintainable data pipelines over both structured and unstructured data. You will play a crucial role in achieving our ambitious objectives: establishing Phoenix as AWS's premier order management and orchestration engine, transforming BPO into a fully data-driven organization, and centralizing BPO data assets while enabling self-serve analytics.

The ideal candidate has strong business judgment, a good sense of architectural design, excellent writing and documentation skills, and experience with big data technologies (Spark/Hive, Redshift, EMR, and other AWS technologies). This role involves overseeing existing pipelines as well as developing new ones to support key initiatives. You'll work on implementing comprehensive data governance frameworks, increasing adoption of advanced analytics and AI/ML tools, and migrating data and ETL processes to more efficient systems.
Additionally, you'll contribute to implementing self-serve analytics platforms, optimizing data pipeline creation processes, and integrating data from multiple sources to support BPO's growing data needs. The operating environment is fast-paced and dynamic, with a strong team-oriented and welcoming culture. To thrive, you must be detail-oriented, enthusiastic, and flexible. In return, you will gain tremendous experience with the latest big data technologies and exposure to various use cases to improve process effectiveness, customer experience, and automation.

Responsibilities

  • Design and develop ETL processes using AWS services such as AWS Glue, Lambda, EMR, and Step Functions, aiming to reduce pipeline creation time and improve efficiency
  • Implement and maintain a comprehensive data governance framework for Phoenix, ensuring data integrity, security, and compliance
  • Automate data monitoring, alerting, and incident response processes to ensure the reliability and availability of data pipelines, striving for near real-time data delivery
  • Collaborate with cross-functional teams including analysts, business intelligence engineers, and stakeholders to understand data requirements and design solutions that support BPO's transformation into a data-driven organization
  • Lead the development and implementation of a self-serve analytics platform, empowering both technical and non-technical users to drive their own analytics and reporting
  • Explore and implement advanced analytics and AI/ML tools to enhance data processing and insights generation capabilities
  • Stay up-to-date with the latest AWS data services, features, and best practices, recommending improvements to the data architecture to support BPO's growing data needs
  • Provide technical support and troubleshooting for issues related to data pipelines, data quality, and data processing, ensuring Phoenix becomes the trusted source of truth for AWS agreements and order management

Requirements

  • 3+ years of data engineering experience
  • Experience with SQL
  • Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js
  • Experience with data modeling, warehousing and building ETL pipelines
  • Knowledge of distributed systems as they pertain to data storage and computing

Nice-to-haves

  • Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
  • Experience with Apache Spark / Elastic MapReduce (EMR)
  • Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
  • Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
  • Knowledge of professional software engineering & best practices for full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence
  • Experience working on and delivering end-to-end projects independently

Benefits

  • 401(k)
  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life insurance
  • Disability insurance
  • Paid holidays
  • Paid volunteer time
  • Tuition reimbursement
  • Employee stock purchase plan
  • Flexible scheduling
  • Professional development

Job Keywords

Hard Skills
  • AWS Glue
  • Java
  • Node.js
  • Python
  • SQL