GCP Data Engineer

$100,000 - $125,000/Yr

Unclassified - Woonsocket, RI

posted 4 months ago

Full-time - Entry Level
5,001-10,000 employees

About the position

The GCP Data Engineer position is a critical role within a leading global IT services and consulting company serving a wide range of industries, including banking, financial services, retail, manufacturing, and healthcare. The company is recognized for its commitment to innovation and invests heavily in research and development to maintain its competitive edge in the technology sector.

This role requires a deep understanding of Google Cloud architecture and a strong background in database and data warehouse technologies, particularly Google's offerings such as BigQuery, Cloud SQL, Spanner, and Bigtable. As a GCP Data Engineer, you will build and maintain the infrastructure needed for optimal extraction, transformation, and loading (ETL) of data from diverse sources, using SQL and Google Cloud's big data technologies to create efficient data pipelines with tools such as Dataproc clusters, Dataflow, Pub/Sub, Cloud Composer (Airflow), and Cloud Functions.

The position demands fluency in object-oriented programming languages, with a preference for Python, and a solid understanding of distributed systems architecture. It also involves implementing data quality processes, encompassing data cleansing, audits, alerts, and triage mechanisms that preserve referential integrity. Familiarity with CI/CD practices, including release and deployment using Google Cloud Build, Gitflow, GKE, and Docker, is also essential. This position is designed for individuals who are passionate about data engineering and eager to contribute to the modernization of data warehousing and cloud-based data lakes.
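To give a concrete sense of the pipeline work described above, here is a minimal sketch of a streaming Dataflow job written with the Apache Beam Python SDK. The project, subscription, and table names are hypothetical placeholders, and the BigQuery table is assumed to already exist.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming options; on Dataflow you would also pass --runner=DataflowRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Hypothetical subscription name; messages arrive as raw bytes.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(json.loads)
            # Hypothetical table; CREATE_NEVER assumes it exists with a schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```

A production pipeline would typically add schema validation and dead-letter handling for malformed messages, but the shape above is the core Pub/Sub-to-BigQuery pattern the role describes.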

Responsibilities

  • Understand and implement Google Cloud architecture.
  • Utilize Google's database and data warehouse technologies, including BigQuery, Cloud SQL, Spanner, and Bigtable.
  • Modernize enterprise-scale data modeling and data warehousing.
  • Build infrastructure for optimal extraction, transformation, and loading of data from various sources using SQL and Google Cloud technologies.
  • Develop data pipelines using Dataproc clusters, Dataflow, Pub/Sub, Cloud Composer (Airflow), and Cloud Functions (a Cloud Composer sketch follows this list).
  • Write and maintain code in object-oriented languages, preferably Python.
  • Ensure data quality through cleansing, audits, alerts, and triage mechanisms (see the audit sketch after this list).
  • Implement CI/CD processes for release and deployment using Google Cloud Build, Gitflow, GKE, and Docker.
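
As referenced above, the sketch below shows what a simple Cloud Composer (Airflow) DAG for a scheduled BigQuery load might look like. The DAG id, datasets, and tables are hypothetical, and the operator comes from Airflow's Google provider package.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Runs a standard-SQL query job in BigQuery; table names are placeholders.
    load_sales = BigQueryInsertJobOperator(
        task_id="load_daily_sales",
        configuration={
            "query": {
                "query": (
                    "INSERT INTO `analytics.sales_daily` "
                    "SELECT * FROM `staging.sales` "
                    "WHERE order_date = CURRENT_DATE()"
                ),
                "useLegacySql": False,
            }
        },
    )
```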
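For the data quality bullet, here is a minimal sketch of a referential-integrity audit using the google-cloud-bigquery client library. The orders/customers tables are hypothetical, and a real triage path would route the alert to Pub/Sub or Cloud Monitoring rather than printing it.

```python
from google.cloud import bigquery


def count_orphaned_orders(client: bigquery.Client) -> int:
    """Count orders whose customer_id has no match in the customers table."""
    query = """
        SELECT COUNT(*) AS orphans
        FROM `analytics.orders` o
        LEFT JOIN `analytics.customers` c ON o.customer_id = c.customer_id
        WHERE c.customer_id IS NULL
    """
    # query() submits the job; result() blocks until it finishes.
    row = next(iter(client.query(query).result()))
    return row.orphans


if __name__ == "__main__":
    client = bigquery.Client()
    orphans = count_orphaned_orders(client)
    if orphans > 0:
        # Placeholder for a real alerting/triage mechanism.
        print(f"ALERT: {orphans} orders reference missing customers")
```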

Requirements

  • Experience with Google Cloud architecture.
  • In-depth knowledge of Google's database and data warehouse technologies and concepts.
  • Experience in enterprise-scale data modeling and data warehouse modernization.
  • Proficiency in building infrastructure for ETL processes using SQL and Google Cloud technologies.
  • Experience in building data pipelines with Dataproc clusters, Dataflow, Pub/Sub, Cloud Composer (Airflow), and Cloud Functions.
  • Fluency in object-oriented programming languages, preferably Python.
  • Strong understanding of distributed systems architecture.
  • Experience with data quality processes, including data cleansing and audits.
  • Familiarity with CI/CD practices, including Google Cloud Build and Docker.