Sonatus - Sunnyvale, CA

posted 1 day ago

Full-time - Senior
Sunnyvale, CA
Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services

About the position

Sonatus is a well-funded, fast-paced, and fast-growing company whose technologies and software help automakers build dynamic software-defined vehicles. With two generations of solutions already on the road with a top global OEM, our vehicle and cloud software solutions are at the forefront of the digital transformation of vehicles. The Sonatus team is a talented and diverse collection of technology and automotive specialists hailing from many of the most prominent companies in their respective industries.

We are seeking a highly skilled and experienced Senior Staff Data Analytics Platform Engineer to lead the design, development, and optimization of our data analytics platform. This role requires a deep understanding of scalable data systems, cloud-native architectures, and advanced data processing frameworks. You will play a key role in shaping the future of our data infrastructure, enabling our organization to derive valuable insights from vast amounts of data. This position offers the opportunity to work on cutting-edge technologies and drive innovation in our data engineering practices.

Responsibilities

  • Lead the design and implementation of a robust, scalable, and high-performance data analytics platform capable of handling large-scale data workloads.
  • Architect and optimize data pipelines for both real-time and batch processing, ensuring data is delivered accurately and efficiently across the organization.
  • Evaluate and select appropriate technologies and frameworks (e.g., Spark, Flink, Kafka, Presto) to build a modern data analytics platform that meets business needs.
  • Identify and address performance bottlenecks in the data platform, ensuring the system can scale to meet the growing data demands of the business.
  • Implement data partitioning, sharding, and indexing strategies to optimize query performance and reduce latency.
  • Ensure the platform is designed for high availability, reliability, and fault tolerance, minimizing downtime and ensuring business continuity.
  • Collaborate with software engineering, data engineering, and analytics teams to integrate data sources into the platform and ensure data quality and consistency.
  • Develop and enforce best practices for data governance, data security, and data management within the platform.
  • Implement data cataloging and metadata management tools to improve data discoverability and usability across the organization.
  • Lead cross-functional teams to deliver data-driven solutions that support business objectives and drive strategic initiatives.
  • Provide technical leadership and mentorship to junior engineers, fostering a culture of continuous learning and improvement.
  • Collaborate with product managers, data scientists, and other stakeholders to understand data requirements and translate them into scalable solutions.
  • Stay up to date with the latest advancements in data engineering, analytics, and cloud technologies, and apply them to enhance the platform.
  • Drive continuous improvement initiatives to streamline data processing workflows, reduce costs, and improve the overall efficiency of the data platform.
  • Contribute to the development of internal tools and frameworks that enhance the capabilities and performance of the data analytics platform.

Requirements

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 10+ years of experience in data engineering, data platform engineering, or a similar role, with a strong focus on building and scaling data analytics platforms.
  • Proven experience with big data technologies (e.g., Apache Spark, Apache Flink, Apache Kafka) and cloud-based data platforms (e.g., AWS, Google Cloud, Azure).
  • Expertise in designing and managing large-scale data pipelines and distributed data processing systems.
  • Strong knowledge of SQL and NoSQL databases, data lake solutions (e.g., Apache Iceberg, BigQuery), and ETL processes.
  • Proficiency in programming languages such as Go, Java, or Scala for building data applications and automating workflows.
  • Experience with containerization and orchestration tools (e.g., Docker, Kubernetes) for deploying and managing data services.
  • Excellent problem-solving and analytical skills, with a keen attention to detail.
  • Strong communication skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders.
  • Leadership abilities, with experience mentoring and guiding engineering teams.

Nice-to-haves

  • Experience with real-time data streaming technologies and frameworks.
  • Knowledge of data governance, data quality, and master data management best practices.
  • Familiarity with machine learning pipelines and integration with data analytics platforms.

Benefits

  • Competitive compensation and equity program
  • Health care plan (Medical, Dental & Vision)
  • Flexible and Dependent Care Expense program
  • Retirement plan (401k)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Unlimited paid time off per year
  • Hybrid office work arrangement with flexibility
  • Complimentary lunches, snacks, and beverages on on-site working days
  • Wellness benefit allowances (towards gym membership and fitness programs)
  • Internet reimbursement
  • Computer Accessory Allowance
  • Departmental team building and outings
  • Employee Referral Program
  • Culture/Employee Satisfaction Surveys - Feedback matters!
  • Peer Award Program (monthly)
  • Innovative Award Program (annual)