Snowflake Computing - San Francisco, CA

Full-time - Senior
Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services

About the position

The Data Platform Architect at Snowflake provides leadership in the design and architecture of the Snowflake Cloud Data Platform. The position involves collaborating with sales teams, product management, and technology partners, applying deep expertise in data architecture and best practices to ensure customer success and operational efficiency. The role is pivotal in influencing product roadmaps and creating reference architectures while engaging with both technical and business executives.

Responsibilities

  • Apply multi-cloud data architecture expertise while presenting Snowflake technology to executives and technical contributors.
  • Partner with sales teams and channel partners to understand customer needs and strategize on winning sales cycles.
  • Provide value-based enterprise architecture deliverables and support strategic enterprise pilots and proof-of-concepts.
  • Collaborate with the Product team to influence Cloud Data Platform product roadmaps based on feedback.
  • Work with Product Marketing teams to build awareness and support pipeline building through various channels.
  • Contribute to the creation of reference architectures, blueprints, and best practices based on field experience.

Requirements

  • 10+ years of architecture and data engineering experience in the enterprise data space.
  • Deep technical hands-on expertise in Data Warehouse Modernization/Migrations, Data Lakes, and Data Engineering.
  • Development experience with SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other Big Data technologies.
  • Experience with data engineering tools for ingestion, transformation, and curation.
  • Familiarity with real-time use cases and technologies like Kafka and Flink.
  • Understanding of integration services and tools for building ETL and ELT data pipelines.
  • 3+ years of cloud provider experience, with certifications in AWS, GCP, and/or Azure.
  • Working knowledge of open table formats such as Hudi and Iceberg.
  • Hands-on experience with database change management and DevOps processes.
  • Strong architectural expertise in data engineering.

Nice-to-haves

  • Experience delivering at least one Big Data project involving a data lake or data engineering.
  • Master's degree in computer science, engineering, mathematics, or a related field.

Benefits

  • Competitive salary and performance bonuses.
  • Health insurance coverage.
  • 401k retirement savings plan with matching contributions.
  • Flexible work hours and remote work options.
  • Professional development opportunities and continued education support.