Enterprise Engineering - Dallas, TX

posted 4 months ago

Full-time
Dallas, TX
Professional, Scientific, and Technical Services

About the position

We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Data Engineer to join our team. The ideal candidate will have a strong background in migrating Teradata to GCP. You will be responsible for designing, implementing, and optimizing data solutions on GCP and ensuring seamless data migration. This role requires a deep understanding of data engineering principles and practices, particularly in cloud-based environments, and you will work closely with cross-functional teams to ensure that data solutions meet the needs of the organization and align with best practices in data management and governance.

In this position, you will lead the migration of Teradata data warehouses to GCP, ensuring data integrity, security, and minimal downtime. You will develop and implement data migration strategies, including ETL processes, data validation, and performance tuning. Strong proficiency in SQL is essential, as you will write complex queries and optimize them for performance. You will also design, develop, and maintain data pipelines and ETL processes using GCP services such as Airflow, Dataflow, BigQuery, and Pub/Sub, and your ability to design cost-effective solutions in GCP will be critical to the success of our data initiatives.

You will optimize and automate data workflows to ensure high performance and scalability, and implement data quality checks and monitoring to ensure the accuracy and reliability of data. Collaboration with data analysts, data scientists, and other stakeholders will be key to understanding data needs and delivering high-quality solutions. You will also provide technical guidance and mentorship to junior data engineers and team members, documenting data engineering processes, best practices, and technical specifications to foster a culture of knowledge sharing and continuous improvement.

Responsibilities

  • Lead the migration of the Teradata data warehouse to Google Cloud Platform, ensuring data integrity, security, and minimal downtime.
  • Develop and implement data migration strategies, including ETL processes, data validation, and performance tuning.
  • Collaborate with cross-functional teams to understand data requirements and ensure successful data migration.
  • Design, develop, and maintain data pipelines and ETL processes using Google Cloud Platform services such as Airflow, Dataflow, BigQuery, and Pub/Sub.
  • Optimize and automate data workflows to ensure high performance and scalability.
  • Implement data quality checks and monitoring to ensure accuracy and reliability of data.
  • Provide technical guidance and mentorship to junior data engineers and team members.
  • Document data engineering processes, best practices, and technical specifications.

Requirements

  • Minimum of 5 years of experience with Python or another scripting language, as well as SQL.
  • Strong proficiency in SQL, with the ability to write complex queries and optimize them for performance.
  • Experience in data warehouse environments, with a preference for candidates with Teradata experience.
  • Ability to design cost-effective solutions in Google Cloud Platform.
  • Strong understanding of data migration strategies and ETL processes.

Nice-to-haves

  • Prior experience with Teradata or another data warehouse platform.