Isagenix - Gilbert, AZ

posted 1 day ago

Full-time
Hybrid - Gilbert, AZ
Health and Personal Care Retailers

About the position

The Data Engineer will build and maintain enterprise data management infrastructure both on-premises and in the cloud, orchestrating pipelines with modern data tools and architectures as well as designing and engineering existing transactional and RDBMS processing systems. This position currently follows a hybrid work model: employees are required to work from the office at least four days per week (Monday through Thursday), with Friday available for remote work, offering a blend of in-person collaboration and flexibility.

Responsibilities

  • Develop and maintain data pipelines that extract, transform, and load data into an information and analytics environment
  • Develop and maintain datasets in a conventional data warehouse (operational data store, dimensional models)
  • Develop and maintain datasets in a modern, cloud-based data warehouse (Redshift, Snowflake, Azure)
  • Implement and configure datasets on column-oriented (column-store) database management systems
  • Assist application development teams during application design and development for highly complex and critical data projects
  • Create and enhance analytic and data platforms using tools that enable state-of-the-art, next-generation capabilities and applications
  • Utilize programming languages and frameworks such as SQL, Java, Spark, and Python, with an emphasis on tuning, optimization, and best practices for application developers
  • Function as a team member in an Agile development environment
  • Leverage DevOps techniques and practices such as Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of end-user capabilities
  • Develop data solutions on cloud deployments such as AWS and Azure
  • Understand concepts of data ingestion from event driven architectures
  • Stay at the cutting edge of new technologies in the Data Engineering, Big Data and Analytics, and Cloud space

Requirements

  • Bachelor's degree from an accredited institution or equivalent experience
  • 4+ years of experience connecting to various data sources and structures: APIs, NoSQL, RDBMS, Blob Storage, Data Lakes, etc.
  • Knowledge of cloud platforms such as AWS, Azure, and Snowflake
  • Database systems (SQL and NoSQL)
  • ETL tools including SSIS, Glue, and Kinesis Firehose
  • Data APIs
  • Python, PowerShell
  • Experience with multiplatform integration and distributed systems (Kafka)

Nice-to-haves

  • Experience with DevOps tools such as Azure DevOps and Visual Studio
  • Master Data Management (MDM)
  • Understanding of data-warehousing and data-modeling techniques
  • Advanced-level SQL for data transformations, queries, and data modeling
  • Data Quality tools and processes
  • Graph Database Systems including Neo4J
  • Cypher Query Language
  • Data warehousing solutions
  • T-SQL, MDX, and Spark programming languages
  • Prior knowledge of modern cloud-based ETL tools such as Matillion or Fivetran