Snowflake Developer

$124,800/Yr

Plaxonic Technologies - Coppell, TX

posted 4 months ago

Full-time

About the position

The Snowflake Developer position at Plaxonic Technologies is a full-time role based in Coppell, TX, focused on applying advanced data management and analytical skills to support the company's data warehousing needs. The ideal candidate will have a minimum of 8 years of experience in data management, specializing in analytical data warehouses, including at least 3 years of hands-on work with the Snowflake cloud warehouse and its features. The role requires a deep understanding of AWS cloud architecture and services, as well as proficiency in creating and analyzing SQL and PL/SQL procedures for data integration.

In this position, the developer will build ETL pipelines that move data in and out of the data warehouse using a combination of Python and Snowflake SQL, and will write Unix shell scripts to extract and load data efficiently. A solid grasp of logical and physical data models is essential to support analytics and business intelligence initiatives. The developer will create SQL-based processes to build slowly changing dimensions and various types of fact tables, ensuring data integrity and accuracy throughout the data lifecycle.

The position also encompasses a range of data management tasks, including Data Lineage Analysis, Data Profiling, Mapping Integration, ETL Validation, Cleansing, Masking, Subsetting, Archiving, Purging, Virtualization, and Metadata Management. The developer will perform unit tests, support system integration testing, and assist with user acceptance testing (UAT). SQL performance tuning and troubleshooting are critical components of the role, as is the ability to identify and resolve technical issues effectively. The candidate will work within an Agile project framework and use Jira for documentation and project management.
Strong verbal and written communication skills are essential for collaborating with team members and stakeholders.

Responsibilities

  • Develop ETL pipelines for data integration using Python and Snowflake SQL.
  • Create and analyze SQL and PL/SQL procedures for data integration.
  • Develop scripts using Unix shell for data extraction and loading.
  • Engage in Data Lineage Analysis and Data Profiling.
  • Perform ETL Validation, Cleansing, Masking, and Subsetting.
  • Create SQL-based processes for slowly changing dimensions and various types of facts.
  • Conduct unit tests and support system integration testing and UAT.
  • Engage in SQL performance tuning and troubleshooting.
  • Document project progress and issues using Jira.
  • Collaborate with team members in an Agile project environment.

Requirements

  • Minimum 8 years of experience in Data Management with a focus on analytical data warehouses.
  • At least 3 years of hands-on experience with Snowflake cloud warehouse and its features.
  • Good knowledge of AWS architecture and services.
  • Experience in creating and analyzing SQL and PL/SQL procedures for data integration.
  • Proficiency in developing ETL pipelines using Python and Snowflake SQL.
  • Experience in scripting with Unix shell for data extraction and loading.
  • Strong understanding of logical and physical data models for analytics and business intelligence.
  • Experience in Data Lineage Analysis, Data Profiling, and Mapping Integration.
  • Ability to perform ETL Validation, Cleansing, Masking, and Subsetting.
  • Experience in SQL performance tuning and troubleshooting.

Nice-to-haves

  • Familiarity with Agile methodology and project management tools like Jira.
  • Experience with data warehousing best practices and methodologies.