Empower Professionals - Los Angeles, CA


Full-time - Senior
Los Angeles, CA
Professional, Scientific, and Technical Services

About the position

The AWS Data Architect role involves designing and implementing data solutions using AWS services. The position requires a strong understanding of data architecture, ETL processes, and big data technologies, with a focus on creating scalable and secure data storage and processing solutions. The role is a 12+ month contract, hybrid, with in-office presence three days a week in Woodland Hills, CA.

Responsibilities

  • Designing and implementing data solutions using AWS services like S3, Glue, EMR, and Kinesis.
  • Creating blueprints for data storage, processing, and access, considering performance, scalability, security, and cost-effectiveness.
  • Building Lakehouse solutions on AWS using Lambda, EMR, and EKS, and implementing ETL pipelines to move and transform data.
  • Designing data access for querying with Athena and integrating Iceberg tables in the Glue Catalog with Snowflake for analytics and reporting (a sketch of this pattern follows the list).
  • Processing large datasets via distributed computing on EMR with Spark, and building streaming solutions with Kinesis and Kafka.
  • Ensuring data quality and compliance with regulations like GDPR, and implementing security measures to protect sensitive information.
  • Collaborating with stakeholders to understand their needs and translating them into technical solutions.
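
To make the responsibilities concrete, here is a minimal PySpark sketch of one such pipeline step: reading raw events from S3, cleaning them, and writing an Iceberg table registered in the Glue Catalog so that Athena (and Snowflake's Iceberg integration) can query it. All bucket, database, and table names are hypothetical, and the Iceberg Glue-catalog jars are assumed to be on the Spark classpath.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Spark session with an Iceberg catalog backed by AWS Glue (assumes the
    # iceberg-spark-runtime and iceberg-aws jars are on the classpath).
    spark = (
        SparkSession.builder
        .appName("raw-events-to-iceberg")
        .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.glue.catalog-impl",
                "org.apache.iceberg.aws.glue.GlueCatalog")
        .config("spark.sql.catalog.glue.warehouse", "s3://example-lake/warehouse/")
        .getOrCreate()
    )

    # Read raw JSON events from S3 (hypothetical path).
    raw = spark.read.json("s3://example-lake/raw/events/")

    # Basic cleanup: de-duplicate on an assumed event_id key and normalize
    # the event timestamp.
    cleaned = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .filter(F.col("event_ts").isNotNull())
    )

    # Create (or replace) an Iceberg table in the Glue Catalog; Athena and
    # Snowflake's Iceberg integration can then read the same table.
    cleaned.writeTo("glue.analytics.events").createOrReplace()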

Requirements

  • 10+ years of experience in data architecture and related fields.
  • Deep understanding of core AWS services (S3, EC2, EKS, EMR, VPC, IAM, Glue, Athena).
  • Strong knowledge of dimensional modeling, schema design, and data warehousing principles.
  • Experience with ETL tools and techniques for data extraction, transformation, and loading.
  • Familiarity with big data technologies such as Hadoop, Spark, and Hive.
  • Proficiency in SQL and experience with relational databases.
  • Hands-on coding experience in Python with PySpark for automation and data processing tasks (see the example after this list).
  • Understanding of data governance, security best practices, and compliance requirements.
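
On the automation side, a short boto3 sketch of the Python skills listed above: starting an Athena query and polling it to completion. The region, database, and results bucket are placeholders, not details from the posting.

    import time

    import boto3

    athena = boto3.client("athena", region_name="us-west-2")  # assumed region

    # Kick off a query against a hypothetical database and table.
    start = athena.start_query_execution(
        QueryString="SELECT event_id, event_ts FROM events LIMIT 10",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-lake/athena-results/"},
    )
    query_id = start["QueryExecutionId"]

    # Poll until Athena reaches a terminal state; production code would add a
    # timeout and exponential backoff.
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        print(f"Fetched {len(rows) - 1} data rows")  # first row is the header
    else:
        print(f"Query finished in state {state}")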

Nice-to-haves

  • AWS certifications (e.g., Solutions Architect, Big Data Specialty).
  • Experience with NoSQL databases like DynamoDB.