Hired By Matrix - Newark, NJ

posted 5 days ago

Full-time - Mid Level
Newark, NJ
Administrative and Support Services

About the position

The Data Engineer - Special Skillset (AI/ML, AWS) position is designed for individuals looking to advance their careers in a global financial company. This role focuses on building and maintaining data solutions using AWS technologies, particularly in the areas of AI and machine learning. The successful candidate will be responsible for designing efficient data architectures, implementing data ingestion pipelines, and ensuring high performance in data engineering projects.

Responsibilities

  • Design, build, and maintain efficient, reusable, and reliable architecture and code.
  • Build reliable and robust data ingestion pipelines (within AWS, on-premises to AWS, etc.); a minimal sketch follows this list.
  • Ensure the best possible performance and quality of large-scale data engineering projects.
  • Participate in architecture and system design discussions.
  • Independently perform hands-on development and unit testing of applications.
  • Collaborate with the development team to build individual components into complex enterprise web systems.
  • Work in a team environment with product, production operations, QE/QA, and cross-functional teams to deliver projects throughout the full software development cycle.
  • Identify and resolve performance issues.
  • Keep up to date with new technology developments and implementations.
  • Participate in code reviews to ensure standards and best practices are met.
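
For illustration only (not part of the posting): a minimal sketch of the kind of ingestion pipeline described above, assuming an S3-triggered Lambda that copies newly landed files into a date-partitioned raw zone of a data-lake bucket. The bucket names, environment variable, and key layout are all hypothetical.

    import os
    import urllib.parse
    from datetime import datetime, timezone

    import boto3

    s3 = boto3.client("s3")
    DATA_LAKE_BUCKET = os.environ.get("DATA_LAKE_BUCKET", "example-data-lake")

    def handler(event, context):
        """Copy each newly landed S3 object into a date-partitioned raw zone."""
        for record in event.get("Records", []):
            src_bucket = record["s3"]["bucket"]["name"]
            # S3 event notification keys are URL-encoded; decode before use.
            src_key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            # Partition by ingestion date so Athena/Glue can prune scans.
            day = datetime.now(timezone.utc).strftime("year=%Y/month=%m/day=%d")
            dest_key = f"raw/{day}/{os.path.basename(src_key)}"

            s3.copy_object(
                Bucket=DATA_LAKE_BUCKET,
                Key=dest_key,
                CopySource={"Bucket": src_bucket, "Key": src_key},
            )
        return {"status": "ok"}

In practice such a function would be deployed via CloudFormation and wired to an S3 event notification; date partitioning in the key prefix is one common way to keep downstream Athena and Glue queries cheap.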

Requirements

  • Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.
  • Experience implementing and supporting data lakes, data warehouses, and data applications on AWS for large enterprises.
  • Programming experience with Python, shell scripting, and SQL.
  • Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
  • Solid experience implementing solutions on AWS-based data lakes.
  • Good experience with AWS services: API Gateway, Lambda, Step Functions, SQS, DynamoDB, S3, and Elasticsearch.
  • Serverless application development using AWS Lambda.
  • Experience with AWS data lakes, data warehouses, and business analytics.
  • Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS.
  • Knowledge of ETL/ELT.
  • Experience delivering end-to-end data solutions (ingestion, storage, integration, processing, access) on AWS.
  • Ability to architect and implement a CI/CD strategy for the EDP.
  • Ability to implement high-velocity streaming solutions using Amazon Kinesis, SQS, and Kafka (preferred); see the streaming sketch after this list.
  • Ability to migrate data from traditional relational database systems, file systems, and NAS shares to AWS relational databases such as Amazon RDS, Aurora, and Redshift.
  • Ability to migrate data from APIs to the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift.
  • Ability to implement POCs for any new technology or tool to be adopted on the EDP and onboard it for real use cases.
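
For illustration only (not part of the posting): a hedged sketch of a high-velocity streaming producer writing JSON events to an Amazon Kinesis stream via boto3. The stream name, event shape, and partition-key choice are assumptions; batching through put_records keeps per-request overhead low.

    import json

    import boto3

    kinesis = boto3.client("kinesis")
    STREAM_NAME = "example-ingest-stream"  # hypothetical stream name

    def publish_events(events):
        """Send a batch of event dicts to Kinesis, keyed for even sharding."""
        records = [
            {
                "Data": json.dumps(evt).encode("utf-8"),
                "PartitionKey": str(evt.get("id", "default")),
            }
            for evt in events
        ]
        # put_records accepts at most 500 records per call; chunk accordingly.
        for i in range(0, len(records), 500):
            resp = kinesis.put_records(
                StreamName=STREAM_NAME, Records=records[i : i + 500]
            )
            if resp.get("FailedRecordCount"):
                # Production code would retry only the failed subset, with backoff.
                raise RuntimeError(f"{resp['FailedRecordCount']} records failed")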

Nice-to-haves

  • AWS Solutions Architect or AWS Developer Certification preferred.
  • Good understanding of Lakehouse/data cloud architecture.