Paladin Consulting - Richardson, TX

Full-time - Mid Level
Richardson, TX
Professional, Scientific, and Technical Services

About the position

The Snowflake/SnapLogic Developer role is a senior data engineering position focused on designing, developing, and maintaining scalable data solutions using Snowflake and SnapLogic. The position requires expertise in data extraction, transformation, and loading (ETL), as well as strong analytical skills to support complex data workflows and ensure data quality. The developer will work closely with stakeholders to integrate data from multiple sources and optimize data delivery processes.

Responsibilities

  • Plan, analyze, develop, maintain, and enhance client systems of moderate to high complexity.
  • Participate in the design, specification, implementation, and maintenance of systems.
  • Design, code, test, and document software programs of moderate complexity as per requirement specifications.
  • Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and other ETL tools.
  • Participate in design reviews and technical briefings for specific applications.
  • Integrate data from various sources, ensuring consistency, accuracy, and reliability.
  • Develop and manage ETL/ELT processes to support data warehousing and analytics.
  • Assist in preparing requirement specifications, analyzing data, and developing data-driven applications, including documenting and revising user procedures and/or manuals.
  • Resolve software development issues of medium to high complexity that may arise in a production environment.
  • Utilize Python for data manipulation, automation, and integration tasks (see the sketch after this list).
  • Assemble large, complex data sets that meet functional/non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL Server, PostgreSQL, SSIS, T-SQL, PL/SQL.
  • Work with stakeholders including Product, Data, Design, Frontend, and Backend teams to assist with data-related technical issues and support their data infrastructure needs.
  • Write complex SQL, T-SQL, and PL/SQL queries, stored procedures, functions, and cursors in SQL Server and PostgreSQL; peer-review other team members' code.
  • Analyze long-running queries, functions, and procedures; design and develop performance-optimization strategies.
  • Create and manage SSIS packages and/or Informatica workflows to perform day-to-day ETL activities, applying a variety of strategies for complex data transformations.
  • Perform DBA activities such as maintaining system health, performance tuning, managing database access, deploying to higher environments, and providing on-call support; shell scripting and Python scripting are a plus.
  • Participate in employing Continuous Integration and Continuous Delivery/Deployment (CI/CD) tools for optimal productivity.
  • Collaborate with scrum team members during daily standup and actively engage in sprint refinement, planning, review, and retrospective.
  • Analyze, review, and alter programs to increase operating efficiency or adapt to new requirements.
  • Write documentation to describe program development, logic, coding, and corrections.
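
To illustrate the Python data-manipulation and integration responsibilities above, here is a minimal ETL sketch, assuming a PostgreSQL source and a Snowflake target; every connection parameter, table, and column name below is hypothetical, not this employer's actual pipeline:

    # Minimal ETL sketch (illustrative only): extract from PostgreSQL,
    # transform with pandas, load into Snowflake. All connection details,
    # tables, and columns are hypothetical.
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas
    from sqlalchemy import create_engine

    # Extract: read raw rows from a (hypothetical) PostgreSQL staging table.
    pg = create_engine("postgresql+psycopg2://etl_user:secret@pg-host:5432/sales")
    orders = pd.read_sql(
        "SELECT order_id, customer_id, amount, order_ts FROM staging.orders", pg
    )

    # Transform: drop incomplete rows and derive a date column.
    orders = orders.dropna(subset=["order_id", "customer_id"])
    orders["order_date"] = pd.to_datetime(orders["order_ts"]).dt.date
    orders["amount"] = orders["amount"].round(2)

    # Load: push the frame into a (hypothetical) Snowflake table.
    con = snowflake.connector.connect(
        user="ETL_USER", password="secret", account="xy12345",
        warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
    )
    write_pandas(con, orders, table_name="ORDERS", auto_create_table=True)
    con.close()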

Requirements

  • Bachelor's degree (BA/BS) in a related field such as information systems, mathematics, or computer science.
  • Typically has 6 years of relevant work experience, with consideration given to equivalent combinations of education and experience.
  • Excellent written and verbal communication skills.
  • Strong organizational and analytical skills.
  • Expertise in Data Extraction, Transformation, Loading, Data Analysis, Data Profiling, and SQL Tuning.
  • Expertise in relational and dimensional databases on engines such as SQL Server, PostgreSQL, and Oracle.
  • Strong experience in designing and developing enterprise-scale data warehouse systems using Snowflake.
  • Strong expertise in designing and developing reusable, scalable data products with data quality scores and integrity checks.
  • Strong expertise in developing end-to-end complex data workflows using data ingestion tools such as SnapLogic, Azure Data Factory (ADF), Matillion, etc.
  • Experience with AWS/Azure cloud technologies, Agile methodologies, and DevOps is a big plus.
  • Experience in architecting cloud-native solutions across multiple B2B and B2B2C data domains.
  • Experience architecting modern APIs for secure sharing of data across internal application components as well as external technology partners.
  • Experience with data orchestration tools such as Apache Airflow, Chronos on a Mesos cluster, etc. (see the DAG sketch after this list).
  • Expertise in designing and developing data transformation models in dbt.
  • Knowledge of Python for data manipulation and automation.
  • Knowledge of data governance frameworks and best practices.
  • Knowledge of integrating with source code versioning tools such as GitHub.
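
As a concrete reference point for the orchestration requirement above, here is a minimal Apache Airflow DAG sketch in Python; the DAG id, schedule, and task bodies are hypothetical placeholders:

    # Minimal Airflow DAG sketch (illustrative only): a daily two-task
    # pipeline. The DAG id, schedule, and task logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders():
        # Placeholder: pull incremental rows from the source system.
        print("extracting orders")

    def load_to_snowflake():
        # Placeholder: stage and merge the extracted rows into Snowflake.
        print("loading to Snowflake")

    with DAG(
        dag_id="orders_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+ keyword; earlier 2.x uses schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)
        load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
        extract >> load  # run the extract task before the load task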

Nice-to-haves

  • Experience with developing CI/CD pipelines in Jenkins or Azure DevOps.