Infinity Tech Group - Charlotte, NC

posted about 2 months ago

Full-time - Senior
Charlotte, NC
Professional, Scientific, and Technical Services

About the position

The Lead Data Engineer position is a critical role within a financial client environment, requiring a seasoned professional with extensive experience in data engineering, particularly with AWS and Snowflake technologies. This hybrid role requires the candidate to be onsite 1-2 days a week, with the primary locations being Charlotte, NC, or Chicago, IL. The ideal candidate will have a robust background in managing and engineering data solutions that support complex business requirements and analytics.

In this role, the Lead Data Engineer will be responsible for developing and implementing data models and pipelines that facilitate the processing and analysis of large datasets. The candidate will work closely with business stakeholders to understand their data needs and translate them into actionable data solutions. The position demands a strong analytical mindset, as the engineer will be expected to identify patterns and anomalies within the data, providing insights that drive business decisions.

The successful candidate will have a minimum of 12-15 years of related experience, with at least 5 years focused specifically on AWS services, including but not limited to Lambda, Gateway, SNS, and Firehose. Proficiency in Snowflake and experience with data processing frameworks such as Spark, AWS Redshift, AWS EMR, and AWS Glue are essential. Familiarity with modern programming languages such as Python, Java, Scala, or NodeJS, as well as with SQL and reporting tools like QuickSight or Tableau, will be crucial for success in this role. A four-year engineering degree is required; additional AWS certifications or a Master's degree are considered advantageous.

Responsibilities

  • Develop and implement complex data models and pipelines for large-scale systems.
  • Collaborate with business owners to understand data requirements and provide actionable insights.
  • Analyze large datasets to identify patterns and anomalies that inform business decisions.
  • Utilize AWS services such as Lambda, Redshift, and Glue to build and maintain data solutions.
  • Work with Snowflake to manage and optimize data storage and retrieval processes.
  • Employ Spark and AWS EMR for data processing and transformation tasks.
  • Write and maintain scripts in programming languages such as Python, Java, Scala, or NodeJS.
  • Create and manage reporting solutions using SQL and tools like QuickSight or Tableau.
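The anomaly-detection responsibility above can be sketched with a simple z-score check. This is a minimal illustration in plain Python; in the actual role this kind of logic would run at scale in Spark, Snowflake, or Glue, and the function name, threshold, and sample data here are all hypothetical:

```python
import statistics

def find_anomalies(values, z_threshold=3.0):
    """Flag values whose z-score exceeds the threshold.

    A toy stand-in for the kind of anomaly detection a data
    pipeline might perform at scale (e.g. in Spark or Snowflake).
    """
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # all values identical: nothing to flag
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Example: daily transaction counts with one obvious outlier.
daily_counts = [102, 98, 105, 99, 101, 97, 100, 500]
print(find_anomalies(daily_counts, z_threshold=2.0))  # → [500]
```

In production the same idea would typically be expressed as a window aggregation in SQL or a Spark DataFrame transformation rather than an in-memory list comprehension.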

Requirements

  • 12-15 years of related experience in data engineering.
  • 5+ years of experience with AWS services including Lambda, Gateway, SNS, Firehose, etc.
  • Strong experience in Spark, AWS Redshift, AWS EMR, AWS Glue, and Terraform.
  • Proficiency in at least one modern scripting or programming language (Python, Java, Scala, NodeJS).
  • Good knowledge of SQL and reporting solutions like QuickSight or Tableau.
  • Experience with Snowflake for data management and analytics.
  • Ability to develop complex data models and implement them across large-scale systems.
  • Strong analytical skills and experience working with large complex datasets.
  • Excellent communication skills to partner with business owners and understand their data needs.

Nice-to-haves

  • AWS certification is a plus.
  • Master's degree is a plus.