Data Engineer

$65,000 - $140,000/Yr

InterWorks - Stillwater, OK

posted 2 months ago

Full-time
Remote - Stillwater, OK

About the position

As a Data Engineer at InterWorks, you will be responsible for designing, developing, and implementing robust data infrastructure and pipelines that empower clients to make data-driven decisions. This role involves tackling diverse projects for a range of clients, from local businesses to Fortune 500 companies, while working closely with a supportive team to deliver high-quality solutions.

Responsibilities

  • Tackle diverse projects that range in duration from a few days to a few months for clients ranging from local businesses to the Fortune 500
  • Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines for data lakes and data warehouses
  • Design cloud-native data pipelines, automation routines, and database schemas that support predictive and prescriptive machine learning
  • Communicate ideas clearly, both verbally and through concise documentation, to business sponsors, business analysts, and technical staff
  • Build and implement ETL frameworks to improve code quality and reliability
  • Build and enforce common design patterns to increase code maintainability
  • Work with disparate data sources (relational databases, flat files, Excel, HDFS/Big Data systems, high-performance analytical databases, etc.) to unify client data

Requirements

  • Excellent SQL fluency
  • Strong ETL proficiency using GUI-based tools or code-based patterns
  • Understanding of data-modeling principles
  • Passion for delivering compelling solutions that exceed client expectations
  • Excellent verbal and written communication
  • Strong problem-solving skills
  • Business acumen
  • A thirst to learn
  • Adaptability and flexibility in changing situations

Nice-to-haves

  • Experience with software engineering practices
  • Experience with modern data-engineering practices and frameworks
  • Experience with integration from semi-structured file and API sources
  • Matillion, Fivetran, dbt, or other ETL tools
  • AWS / Microsoft Azure cloud exposure
  • Snowflake / Databricks / Amazon Redshift / Google BigQuery / Azure Synapse