Data Engineering Architect

$104,000 - $159,000/Yr

Anblicks - Dallas, TX

posted 29 days ago

Full-time - Mid Level
Onsite - Dallas, TX
1-10 employees

About the position

The Data Engineering Architect at Anblicks is responsible for enhancing data architecture capabilities and providing technical leadership in resolving complex data-related issues. This role involves managing data integration projects, ensuring alignment with business objectives, and designing data solutions on the AWS cloud platform. The architect will work with various technologies to optimize data warehouse solutions and develop ETL pipelines, contributing to the overall data strategy of the organization.

Responsibilities

  • Evaluate new technologies and tools to enhance data architecture capabilities.
  • Provide technical leadership and expertise in resolving complex data-related issues.
  • Manage data integration projects and coordinate with cross-functional teams.
  • Define and enforce data architecture standards, principles, and guidelines.
  • Design and implement data solutions on the AWS cloud platform.
  • Develop Hive tables using partitioning and bucketing (a PySpark sketch follows this list).
  • Provision AWS infrastructure services and automate infrastructure changes using Lambda functions (also sketched below).
  • Utilize Amazon GuardDuty for threat detection and remediation.
  • Participate in AWS migration projects to implement a hybrid cloud environment.
  • Utilize Alibaba Cloud big data tools to create data cubes and queries.
  • Work with MongoDB and Redshift databases to move data from AWS to Alibaba Cloud.
  • Build real-time pipelines using TT jobs and Blink SQL.
  • Architect and optimize data warehouse solutions using Snowflake.
  • Use Python and PySpark for data processing and automation tasks.
  • Develop ETL pipelines using Python and Snowflake (a connector sketch follows this list).
  • Design and implement data solutions using Snowflake's cloud data platform.
  • Automate, schedule, and monitor jobs with the IBM Build Forge scheduling tool.
  • Develop ETL workflows using Informatica PowerCenter.
  • Design and optimize Teradata databases for efficient storage and retrieval.
  • Manage Hadoop clusters for big data processing and analytics.
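
The Hive responsibility above refers to a standard warehouse layout technique: partitioning prunes scans by a key such as date, while bucketing pre-shuffles rows to speed up joins and sampling. Below is a minimal PySpark sketch; the paths, table names, and bucket count are invented for illustration:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-table-sketch")
        .enableHiveSupport()        # write through the Hive metastore
        .getOrCreate()
    )

    # Hypothetical source location for the raw extract
    events = spark.read.parquet("s3://example-bucket/raw/events/")

    (
        events.write
        .mode("overwrite")
        .partitionBy("event_date")  # one directory per date -> partition pruning
        .bucketBy(16, "user_id")    # bucketBy only works with saveAsTable
        .sortBy("user_id")
        .saveAsTable("analytics.events_bucketed")
    )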
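
For the Lambda automation and GuardDuty remediation items, one common pattern is a handler invoked by a GuardDuty finding delivered through EventBridge, which then quarantines the affected EC2 instance. The event fields follow the documented GuardDuty finding shape; treating "stop the instance" as the remediation is an illustrative assumption:

    import boto3

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        finding = event["detail"]
        # EC2-related findings carry the instance under resource.instanceDetails
        instance_id = (
            finding.get("resource", {})
            .get("instanceDetails", {})
            .get("instanceId")
        )
        if not instance_id:
            return {"status": "skipped", "reason": "no EC2 instance in finding"}

        # Tag first so responders can see why the instance was stopped
        ec2.create_tags(
            Resources=[instance_id],
            Tags=[{"Key": "guardduty-remediation", "Value": finding["type"]}],
        )
        ec2.stop_instances(InstanceIds=[instance_id])
        return {"status": "stopped", "instanceId": instance_id}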
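
The Snowflake ETL items typically reduce to staging an extract and running COPY INTO. Here is a minimal sketch using snowflake-connector-python; the account, credentials, file path, and table names are placeholders, not values from this posting:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",        # placeholder account locator
        user="ETL_USER",
        password="***",           # pull from a secrets manager in practice
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Upload the local extract to the table's stage, then load it
        cur.execute("PUT file:///tmp/orders.csv @%ORDERS")
        cur.execute(
            "COPY INTO ORDERS FROM @%ORDERS "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()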

Requirements

  • Bachelor's degree in Computer Science, Computer Information Systems, or an Engineering-related field, plus 5 years of progressively responsible experience.
  • Alternatively, a Master's degree in Computer Science, Computer Information Systems, or an Engineering-related field, plus 2 years of experience, is acceptable.
  • Experience should include at least 2 years working with AWS Cloud (S3, Redshift, EC2, RDS, Amazon Kinesis Data Streams, Kinesis Data Firehose, Amazon GuardDuty, Lambda), Big Data, Alibaba Cloud, Snowflake (SnowSQL, Snowpipe), MongoDB, Python, PySpark, Hive, Informatica, and PL/SQL.