EdgeAll - San Jose, CA

posted 11 days ago

Full-time - Mid Level

About the position

As an Azure Data Engineer, you will be responsible for designing, building, maintaining, monitoring, and troubleshooting large-scale data pipelines. This role involves owning a mix of batch and event-driven data processing applications, ensuring strict adherence to SLAs for processing latency and accuracy. You will collaborate with Data Scientists, Product Managers, Machine Learning Engineers, and Platform/Service Engineers to create robust, fault-tolerant data processing applications, including data lakes and warehouses.

Responsibilities

  • Design and build large-scale data pipelines and storage solutions using Python, PySpark, and Databricks.
  • Manage data warehouses and data lakes such as BigQuery, GCS, and Delta Lake.
  • Utilize reporting/dashboarding tools like Google Looker, Tableau, and Microsoft Power BI.
  • Implement data-quality and pipeline-health monitoring solutions using tools like Prometheus, Grafana, and Splunk.
  • Apply DevOps and MLOps practices in data engineering tasks.
  • Collaborate with cross-functional teams to deliver data solutions.

Requirements

  • Bachelor's degree in Computer Science or equivalent.
  • Strong expertise in data analysis, data visualization, and machine learning algorithms.
  • Experience with frameworks like Apache Airflow and Apache Kafka/Streams.
  • Solid understanding of data engineering, data analysis, and data visualization disciplines.

Nice-to-haves

  • Familiarity with automated ML frameworks like Vertex AI or Element.
  • Strong communication skills to convey complex technical concepts to non-technical stakeholders.

Benefits

  • Hybrid work arrangement: work from the Sunnyvale or Bentonville office at least 2 days a week.