Viva Tech Solutions LTD - Los Angeles, CA

posted 4 days ago

Full-time - Senior
Los Angeles, CA

About the position

The Lead Data Architect (AWS) role involves designing and implementing data solutions using AWS services. The position requires a strong background in data architecture, with a focus on creating scalable, secure, and cost-effective data storage and processing solutions. The architect will build Lakehouse solutions and ETL pipelines, ensure data governance and security, and collaborate with stakeholders to meet their data needs.

Responsibilities

  • Designing and implementing data solutions using AWS services like S3, Glue, EMR, and Kinesis.
  • Creating blueprints for data storage, processing, and access that account for performance, scalability, security, and cost-effectiveness.
  • Building Lakehouse solutions on AWS using Lambda, EMR, and EKS, and implementing ETL pipelines for data transformation.
  • Designing data access for querying with Athena and integrating Iceberg tables in the Glue Catalog with Snowflake for analytics and reporting (a brief sketch follows this list).
  • Working with large datasets and distributed computing using EMR and streaming solutions with Kinesis and Kafka.
  • Ensuring data quality and compliance with regulations like GDPR, and implementing security measures for sensitive information.
  • Collaborating with stakeholders to understand their needs and translating them into technical solutions.
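
For illustration only (not part of the original posting), the sketch below shows one way the Lakehouse and ETL responsibilities above might look in practice: a PySpark job, such as one running on EMR, that writes an Iceberg table registered in the Glue Catalog so it can be queried from Athena or Snowflake. The catalog name "glue", the S3 paths, and the database/table names are placeholder assumptions.

```python
from pyspark.sql import SparkSession

# Illustrative sketch only: a PySpark job that writes an Iceberg table
# registered in the Glue Catalog. The catalog name "glue", the S3 paths,
# and the database/table names are placeholder assumptions.
spark = (
    SparkSession.builder
    .appName("lakehouse-etl-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-lake/warehouse")
    .getOrCreate()
)

# Read raw files landed in S3, apply a simple cleanup transformation, and
# persist the result as an Iceberg table that Athena (or Snowflake, via its
# Glue/Iceberg integration) can query.
raw = spark.read.json("s3://example-lake/raw/orders/")
cleaned = raw.dropDuplicates(["order_id"]).filter("order_total > 0")
cleaned.writeTo("glue.analytics.orders").createOrReplace()
```

Because the table is registered through the Glue Catalog, it becomes visible to Athena without any additional DDL.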

Requirements

  • Minimum of 12 years of overall experience in data architecture.
  • At least 2-3 years of experience as an Architect.
  • Deep understanding of core AWS services (S3, EC2, EKS, EMR, VPC, IAM, Glue, Athena); a short Athena example follows this list.
  • Strong knowledge of dimensional modelling, schema design, and data warehousing principles.
  • Experience with tools and techniques for data extraction, transformation, and loading (ETL).
  • Familiarity with big data technologies such as Hadoop, Spark, and Hive.
  • Proficiency in SQL and experience with relational databases; NoSQL experience (like DynamoDB) is a plus.
  • Hands-on programming experience in Python with PySpark for automation and data processing tasks.
  • Understanding of data security best practices, access control, and compliance requirements.
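
As a further hedged illustration of the SQL and core AWS service skills listed above (again, not taken from the posting), the snippet below runs an ad-hoc Athena query against a Glue Catalog table from Python using boto3. The region, database, table, and results bucket are placeholder assumptions.

```python
import time

import boto3

# Illustrative sketch only: run an ad-hoc SQL query against a Glue Catalog
# table through Athena. The region, database, table, and results bucket are
# placeholder assumptions.
athena = boto3.client("athena", region_name="us-west-2")

started = athena.start_query_execution(
    QueryString="SELECT order_id, order_total FROM orders LIMIT 10",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-lake/athena-results/"},
)
query_id = started["QueryExecutionId"]

# Poll until the query reaches a terminal state, then print the result rows.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    result = athena.get_query_results(QueryExecutionId=query_id)
    for row in result["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```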

Nice-to-haves

  • AWS certifications (e.g., Solutions Architect, Big Data Specialty) are highly valued.
  • Experience with NoSQL databases like DynamoDB.