Data Engineer II

$124,800 - $135,200/Yr

The Judge Group - Los Angeles, CA

posted about 2 months ago

Full-time - Mid Level
Remote - Los Angeles, CA
Administrative and Support Services

About the position

Our client is currently seeking a Data Engineer II to join their team. The position is fully remote and pays an hourly rate of $60.00 to $65.00 USD. The Data Engineer will design, develop, and maintain data pipelines and integration processes so that data is accessible and usable for analytics and reporting.

Strong SQL skills are essential for this role, along with experience in one or more data integration tools such as Matillion, SQL SSIS, or DBT. The ideal candidate has a solid understanding of SQL Server and Stored Procedures, as well as familiarity with loading data into Snowflake. Basic Python scripting is also required and will be used to automate tasks and enhance data processing workflows. Familiarity with AWS cloud file storage, specifically S3 buckets, is necessary for managing data storage solutions, and experience calling APIs to retrieve data and map the results is a good-to-have skill.

The position demands a proactive approach to problem-solving and the ability to work independently in a remote environment. Candidates should have at least 8 years of SQL experience and a minimum of 2 years of experience with data integration tools and SQL Server. This is an excellent opportunity for a skilled Data Engineer looking to advance their career in a dynamic and supportive environment.
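
For context, a common pattern behind this kind of work is landing an extract file in S3 and then loading it into Snowflake with a COPY INTO statement driven by a small Python script. The sketch below is a minimal illustration of that pattern only; the bucket, stage, table, and credential names are placeholders assumed for the example, not details from this posting.

```python
# Minimal sketch: upload a CSV extract to S3, then COPY it into Snowflake.
# All names (bucket, stage, table, credentials) are placeholders.
import boto3
import snowflake.connector


def upload_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Push a local extract file into an S3 bucket."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)


def load_into_snowflake(key: str) -> None:
    """COPY the staged file into a Snowflake table via an external stage."""
    conn = snowflake.connector.connect(
        account="example_account",    # placeholder
        user="example_user",          # placeholder
        password="example_password",  # placeholder
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    cur = conn.cursor()
    try:
        cur.execute(
            f"""
            COPY INTO RAW.ORDERS
            FROM @RAW.S3_STAGE/{key}
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            """
        )
    finally:
        cur.close()
        conn.close()


if __name__ == "__main__":
    upload_to_s3("orders.csv", "example-data-bucket", "landing/orders.csv")
    load_into_snowflake("landing/orders.csv")
```

In practice a tool such as Matillion or DBT would orchestrate or transform around this step; the script simply shows the S3-to-Snowflake loading piece the posting describes.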

Responsibilities

  • Design, develop, and maintain data pipelines and integration processes.
  • Utilize strong SQL skills to manage and manipulate data effectively.
  • Work with data integration tools such as Matillion, SQL SSIS, or DBT.
  • Implement and manage SQL Server and Stored Procedures for data operations.
  • Load data into Snowflake databases and ensure data integrity.
  • Write basic Python scripts to automate data processing tasks.
  • Utilize AWS cloud file storage solutions, particularly S3 buckets, for data management.
  • Call APIs for data retrieval and map results as needed (a minimal example follows this list).
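
The API-related bullet above is the sort of task a short Python script typically handles. The sketch below is an assumed, illustrative example using the requests library against a hypothetical endpoint and field names; it is not a description of the client's actual systems.

```python
# Minimal sketch: call a REST API and map the JSON response into flat rows.
# The endpoint URL and field names are hypothetical.
import requests

API_URL = "https://api.example.com/v1/orders"  # placeholder endpoint


def fetch_orders(page: int = 1) -> list[dict]:
    """Retrieve one page of results and return the raw records."""
    resp = requests.get(API_URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])


def map_record(record: dict) -> dict:
    """Map an API record onto the column names a downstream table expects."""
    return {
        "order_id": record.get("id"),
        "customer_id": record.get("customer", {}).get("id"),
        "order_total": record.get("total"),
        "ordered_at": record.get("created_at"),
    }


if __name__ == "__main__":
    rows = [map_record(r) for r in fetch_orders()]
    print(f"Mapped {len(rows)} rows")
```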

Requirements

  • Strong SQL skills with 8+ years of experience.
  • Experience with SQL Server and Stored Procedures for at least 2 years.
  • Proficiency in one or more data integration tools (Matillion, SQL SSIS, DBT) for at least 2 years.
  • Basic knowledge of Python scripting (1+ year of experience).
  • Familiarity with Snowflake database loading (1+ year of experience).
  • Knowledge of AWS cloud file storage, specifically S3 buckets.

Nice-to-haves

  • Experience calling APIs for data retrieval and mapping results.