Saksoft - Phoenix, AZ

posted 20 days ago

Full-time
Phoenix, AZ
Professional, Scientific, and Technical Services

About the position

The Big Data Developer position focuses on using Google Cloud Platform (GCP) technologies to develop and manage big data solutions. The role requires proficiency in tools such as PySpark, BigQuery, and Dataproc, along with a solid understanding of Airflow for orchestration and SQL for data manipulation. The position supports the organization's use of big data analytics to drive business insights and decision-making.

Responsibilities

  • Develop and manage big data solutions using Google Cloud Platform.
  • Utilize PySpark for data processing and transformation.
  • Implement data workflows using Airflow on GCP.
  • Work with BigQuery for data analysis and querying.
  • Manage data processing jobs on Dataproc.

Requirements

  • Proficient in PySpark for big data processing.
  • Strong knowledge of BigQuery for data analysis.
  • Experience with Dataproc for managing data processing jobs.
  • Good understanding of Airflow for workflow orchestration.
  • Familiarity with SQL for data manipulation.
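As a minimal sketch of the kind of SQL data manipulation the requirements list calls for, the query below aggregates and filters rows. It runs locally against Python's built-in SQLite (a stand-in chosen here for illustration only; BigQuery uses its own Standard SQL dialect, and the table and values are invented, not from the posting):

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# Typical manipulation: aggregate per key, then filter groups with HAVING.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id HAVING total > 8 "
    "ORDER BY user_id"
).fetchall()
print(rows)  # [('a', 15.0)]
```

In BigQuery the same GROUP BY / HAVING pattern applies, with the table referenced by its `project.dataset.table` path instead of a local name.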