Rigil Corporation - San Francisco, CA

Posted 9 days ago

Full-time - Mid Level
San Francisco, CA
Professional, Scientific, and Technical Services

About the position

The Data Engineer role at Rigil is central to the evolution of the Vision platform, the operational core of the company's technology consulting and product development efforts. The position focuses on making the data and AI platform scalable, intelligent, and secure so that customers have reliable access to their data. The Data Engineer will build new capabilities that incorporate AI and conversational interfaces, improving user experiences and surfacing insights about customer interactions and business operations.

Responsibilities

  • Create and maintain optimal data pipeline architecture.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, NoSQL, and AWS 'big data' technologies (a minimal ETL sketch in Python follows this list).
  • Build analytics (AI & ML) tools that utilize the data pipeline to provide actionable insights into customer acquisition, device operations, and other key business performance metrics.
  • Keep our data separated and secure across tenants and geographical boundaries through data segmentation and isolation at the infrastructure level (see the row-level security sketch after this list).
  • Manage databases effectively using PostgreSQL, ensuring data integrity and performance.
  • Utilize AWS for continuous deployment and cloud services management.
  • Work within Agile development frameworks, contributing to all phases of the software development lifecycle.
  • Innovate and contribute to a technology roadmap that aligns with our mission of providing sustainable energy solutions.
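
For illustration only, the kind of extract-transform-load step described above could be sketched in Python with boto3, pandas, and SQLAlchemy. The bucket, key, connection string, and table names here are hypothetical, not part of the posting:

    import io

    import boto3
    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical source and destination; replace with real values.
    BUCKET = "example-raw-data"
    KEY = "events/2024-01-01.csv"
    DB_URL = "postgresql://user:password@localhost:5432/vision"

    # Extract: pull a raw CSV out of S3.
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Transform: drop incomplete rows and normalize column names.
    df = df.dropna()
    df.columns = [c.strip().lower() for c in df.columns]

    # Load: append the cleaned rows to a PostgreSQL table.
    engine = create_engine(DB_URL)
    df.to_sql("events_clean", engine, if_exists="append", index=False)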
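
Likewise, one common way to keep tenants separated inside a shared PostgreSQL database is row-level security. The posting does not name a mechanism, so this is a sketch under that assumption, with hypothetical table, column, and setting names (tenant_id is assumed to be a text column):

    import psycopg2

    # Hypothetical connection details.
    conn = psycopg2.connect("dbname=vision user=admin")
    cur = conn.cursor()

    # Let PostgreSQL filter rows by the tenant declared on the session.
    cur.execute("ALTER TABLE events_clean ENABLE ROW LEVEL SECURITY")
    cur.execute("""
        CREATE POLICY tenant_isolation ON events_clean
        USING (tenant_id = current_setting('app.current_tenant'))
    """)
    conn.commit()

    # Each application session declares its tenant up front; subsequent
    # queries then see only that tenant's rows.
    cur.execute("SET app.current_tenant = 'tenant_a'")
    cur.execute("SELECT count(*) FROM events_clean")
    print(cur.fetchone())

Note that row-level security policies do not apply to the table owner by default, so application traffic should run under a separate, non-owner role.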

Requirements

  • Bachelor's degree in computer science or a related field; advanced degree preferred.
  • At least 5 years of data engineering experience.
  • Experience with object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.
  • Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a minimal Airflow sketch follows this list).
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
  • Working knowledge of MLOps.
  • Familiarity with IoT platforms and integrating AI tools in data applications.
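
The workflow-management tools listed above (Azkaban, Luigi, Airflow) all model a pipeline as a dependency graph of tasks. As a hedged sketch only, with hypothetical task names and schedule, an Airflow 2.x DAG wiring three ETL steps together might look like:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder task bodies; a real DAG would call the actual
    # extract/transform/load logic.
    def extract():
        print("pull raw data from source systems")

    def transform():
        print("clean and reshape the extracted data")

    def load():
        print("write the transformed data to the warehouse")

    with DAG(
        dag_id="etl_example",        # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",  # run once per day
        catchup=False,               # do not backfill past runs
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Declare ordering: extract, then transform, then load.
        t_extract >> t_transform >> t_load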