Senior GCP Data Engineer

$180,000/Yr

Roth Staffing Companies - San Diego, CA

posted 8 days ago

Full-time - Mid Level
San Diego, CA
Administrative and Support Services

About the position

This position involves leading and assisting in the development of data automation and infrastructure projects, particularly in a big data commercial setting. The role requires extensive documentation, communication of goals, and support within an agile development team, with a focus on integrating new technologies into existing workflows.

Responsibilities

  • Assist with and/or lead development of data automation & data infrastructure projects.
  • Research, assess, and integrate new technologies into existing workflows.
  • Maintain and improve data pipelines across big data modalities.
  • Extensively document processes and communicate goals clearly.
  • Provide support in an agile development team.

Requirements

  • 5+ years of development experience in a big data commercial setting; healthcare experience preferred.
  • Advanced knowledge of Python and experience deploying applications, pipelines, and databases in a cloud environment.
  • Advanced knowledge of SQL and experience with relational databases (BigQuery, Redshift, Postgres, MySQL, etc.) preferably in a cloud setting.
  • Experience working with sensitive data (HIPAA, GxP, etc.).
  • Experience with a production cloud environment (GCP preferred; AWS or Azure also considered).
  • Strong familiarity with Unix/Linux environments.
  • Knowledge of Infrastructure-as-Code (IaC) and related DevOps tooling such as Terraform, Ansible, Kubernetes, and GitHub Actions.
  • Knowledge of serverless development and architectures such as Cloud Run, AWS Lambda, App Engine, etc.
  • Knowledge of stream processing (Pub/Sub, Apache Kafka) and highly scalable datastores like Cloud Storage and S3.
  • Experience working with cross-functional teams in a dynamic startup environment.
  • Strong project management and organizational skills.
  • Bachelor's degree in a computer-related field (e.g., computer science) and a minimum of 5 years of work experience.

Nice-to-haves

  • Understanding of cloud-native architecture.
  • Experience working with big data in an academic setting.
  • Graduate level coursework and/or projects in big data.
  • Experience with common data models like OMOP and DICOM.
  • Experience with de-identification tools like Google DLP, Flywheel, etc.