SecureKloud Technologies - Minneapolis, MN
posted 2 months ago
The AWS Data Engineer with Snowflake position at SecureKloud is a long-term onsite role based in Minneapolis, MN, open to local candidates only. The role calls for a seasoned professional with over 10 years of data engineering experience and a strong focus on Snowflake and AWS services.

The successful candidate will design, develop, and maintain robust data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT. Collaboration is key: the engineer will work closely with data scientists and analysts to understand their data requirements and implement effective solutions. The engineer will also optimize data workflows for performance, scalability, and reliability, and promptly troubleshoot and resolve data-related issues. Staying current with the latest technologies and best practices in data engineering is essential to maintain a competitive edge in this rapidly evolving field.

The ideal candidate will have a deep understanding of the Snowflake data warehousing platform and proficiency with Snowpark for data processing and analytics. Experience with DBT (Data Build Tool) for modeling data and building transformation pipelines is highly desirable. Hands-on experience with Apache Airflow for orchestrating complex data workflows, AWS Lambda for serverless computing, and AWS Glue for ETL processes is also crucial. Strong Python programming skills are required for developing data pipelines, transformations, and automation tasks. The candidate should additionally demonstrate proven experience in data engineering roles, strong analytical and problem-solving skills, and excellent communication and teamwork abilities.