Xoriant - Sunnyvale, CA

posted 4 days ago

Full-time
Sunnyvale, CA
Professional, Scientific, and Technical Services

About the position

As a Data Engineer, you will be responsible for designing, building, maintaining, monitoring, and troubleshooting large-scale data pipelines. This role involves owning a mix of batch and event-driven data processing applications with strict SLAs around processing latency and accuracy. You will collaborate with Data Scientists, Product Managers, Machine Learning Engineers, and Platform/Service Engineers to create robust, fault-tolerant data processing applications, including data lakes and warehouses.

Responsibilities

  • Design and build large-scale data pipelines and storage solutions using Python, PySpark, Databricks, and the Google Cloud and Azure platforms.
  • Design and manage data warehouses and data lakes such as BigQuery, GCS, and Delta Lake.
  • Use reporting and dashboarding tools such as Looker, Tableau, and Microsoft Power BI.
  • Implement data-quality and pipeline-health monitoring solutions using tools like Prometheus, Grafana, and Splunk.
  • Apply DevOps and MLOps practices in data engineering tasks.
  • Communicate complex technical concepts effectively to non-technical stakeholders.

Requirements

  • Bachelor's degree in Computer Science or equivalent.
  • Strong expertise in data analysis, data visualization, and machine learning algorithms.
  • Experience with frameworks such as Apache Airflow, Apache Kafka, and Kafka Streams.
  • Familiarity with automated ML frameworks such as Vertex AI or Element.
  • Solid understanding of data engineering, data analysis, and data visualization disciplines.

Nice-to-haves

  • Experience with data-quality and pipeline-health monitoring and alerting solutions.
  • Good understanding of DevOps and MLOps practices.

Benefits

  • Hybrid work arrangement: work from the Sunnyvale or Bentonville office at least 2 days a week.