Data Analytics Engineer

$160,000 - $190,000/Yr

Motion Recruitment - Arlington, TX

posted 3 months ago

Full-time - Mid Level
Remote - Arlington, TX
Administrative and Support Services

About the position

The Data Analytics Engineer position is a critical role focused on supporting the data infrastructure and software requirements of the organization. This fully remote position requires candidates to be located in the Eastern Time Zone. The primary responsibility of the Data Analytics Engineer will be to design, implement, and maintain ETL (Extract, Transform, Load) processes and workflows that provide valuable insights to drive business growth. The role involves working closely with cross-functional teams to gather and understand data requirements, ensuring that the data infrastructure aligns with business objectives and enhances decision-making capabilities.

In this position, you will leverage your expertise in programming languages such as Go and Python to develop robust data pipelines and workflows. You will also be responsible for monitoring and optimizing ETL processes to ensure efficiency and reliability. A solid understanding of data modeling concepts and techniques is essential, as you will be tasked with designing data structures that support analytical needs. Additionally, experience with relational databases, particularly PostgreSQL and MySQL, is crucial for managing and querying data effectively.

The Data Analytics Engineer will also work with modern data technologies, including Apache Kafka for event-driven architectures and Kubernetes for container orchestration. Your ability to collaborate with various teams will be key to successfully gathering data requirements and implementing solutions that meet the needs of the organization. This role offers an exciting opportunity to contribute to the company's data strategy and drive impactful business outcomes through data-driven insights.

Responsibilities

  • Design, implement, and maintain ETL processes and workflows.
  • Monitor and optimize ETL workflows for efficiency and reliability.
  • Collaborate with cross-functional teams to gather and understand data requirements.
  • Develop robust data pipelines using Go and Python.
  • Manage and query data effectively using SQL.
  • Work with relational databases such as PostgreSQL and MySQL.
  • Utilize Apache Kafka for event-driven architectures.
  • Implement container orchestration using Kubernetes.

Requirements

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 8 years of experience in Data Engineering or Software Development.
  • Expert knowledge of Go and Python.
  • Strong proficiency in SQL and MySQL.
  • Experience with relational databases such as PostgreSQL and MySQL.
  • Experience with Apache Kafka and event-driven architectures.
  • Experience with Kubernetes (K8s) and containers.
  • Solid understanding of data modeling concepts and techniques.

Nice-to-haves

  • Experience with ELK (Elasticsearch, Logstash, Kibana).
  • Experience with Jasper Reports.
  • Knowledge of Druid.
  • Proficiency in Apache Superset.

Benefits

  • Medical
  • Dental
  • Vision
  • 401k
  • Generous PTO
  • Training & Development