People Tech Group - Redmond, WA

posted 2 months ago

Full-time - Senior
Redmond, WA
Professional, Scientific, and Technical Services

About the position

The Data Architect/Lead Developer role at People Tech Group is a pivotal position focused on architecting and building enterprise-scale data platforms in a greenfield environment. It requires a deep understanding of cloud technologies, particularly Google Cloud Platform (GCP), and the ability to design and implement end-to-end data solutions that meet the needs of the organization. The ideal candidate will have a proven track record of developing data services and platforms that are robust, scalable, and efficient.

In this role, you will leverage your expertise in GCP services such as BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, and PubSub to create data solutions that drive business insights and decision-making. You will also work with microservices architectures, using tools such as Kubernetes and Docker to ensure that data services are deployed in a reliable and maintainable manner.

A significant aspect of this position involves building and maintaining semantic layers, as well as architecting and developing both batch and real-time streaming infrastructures. You will need solid experience in metadata management, including data catalogues, data lineage, data quality, and data observability, to ensure that data workflows are efficient and compliant with industry standards. The role also requires hands-on experience with the GCP ecosystem and data lakehouse architectures, along with a strong understanding of DataOps principles and test automation. Finally, you will implement observability tooling such as Grafana and Datadog to monitor and optimize the performance of data services.

Responsibilities

  • Architect and build enterprise-scale data platforms in a greenfield environment.
  • Develop end-to-end data platforms and data services in the cloud, preferably in GCP.
  • Utilize GCP services such as BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, and PubSub for data solutions.
  • Implement microservices architectures using Kubernetes, Docker, and Cloud Run.
  • Build and maintain semantic layers for data services.
  • Design and develop batch and real-time streaming infrastructure and workloads.
  • Architect and implement metadata management including data catalogues, data lineage, data quality, and data observability for big data workflows.
  • Ensure adherence to DataOps principles and implement test automation.
  • Utilize observability tooling such as Grafana and Datadog for monitoring and optimization.

Requirements

  • Experience architecting and building successful enterprise-scale data platforms.
  • Proficiency in building end-to-end data platforms and data services in the cloud, preferably in GCP.
  • Hands-on experience with the GCP ecosystem and data lakehouse architectures.
  • Solid experience with architecting and implementing metadata management.
  • Experience with microservices architectures, including Kubernetes and Docker.
  • Proficiency in architecting and designing batch and real-time streaming infrastructure.
  • Extensive experience with DataOps principles and test automation.
  • Strong knowledge of observability tooling such as Grafana and Datadog.