SecureKloud Technologies - Minneapolis, MN

Full-time - Mid Level
Publishing Industries

About the position

The AWS Data Engineer with Snowflake position at SecureKloud is a long-term onsite role based in Minneapolis, MN, open specifically to local candidates. The role calls for a seasoned professional with over 10 years of data engineering experience and a strong focus on Snowflake and AWS services. The successful candidate will design, develop, and maintain robust data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT.

Collaboration is key in this role: the engineer will work closely with data scientists and analysts to understand their data requirements and implement effective solutions. The engineer will also optimize data workflows for performance, scalability, and reliability, troubleshoot and resolve data-related issues promptly, and stay current with the latest technologies and best practices in this rapidly evolving field.

The ideal candidate has a deep understanding of the Snowflake data warehousing platform and is proficient with Snowpark for data processing and analytics. Experience with DBT (Data Build Tool) for modeling data and building transformation pipelines is highly desirable, as is hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows, AWS Lambda for serverless computing, and AWS Glue for ETL processes. Strong Python programming skills are required for developing data pipelines, transformations, and automation tasks. The candidate should also demonstrate proven experience in data engineering roles, strong analytical and problem-solving skills, and excellent communication and teamwork abilities.

Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT.
  • Collaborate with data scientists and analysts to understand data requirements and implement solutions.
  • Optimize data workflows for performance, scalability, and reliability.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Stay updated on the latest technologies and best practices in data engineering.

Requirements

  • Deep understanding of the Snowflake data warehousing platform.
  • Proficiency in using Snowpark for data processing and analytics.
  • Experience with DBT (Data Build Tool) for data modeling and building data transformation pipelines.
  • Hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows.
  • Proficiency in AWS Lambda for serverless computing and event-driven architecture.
  • Well-versed in AWS Glue for ETL processes and data integration.
  • Strong programming skills in Python for developing data pipelines and automation tasks.
  • Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and DBT.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.