CloudTrucks
Posted about 1 month ago
Senior Data Analytics Engineer
San Francisco, CA

About the position

The trucking industry is the backbone of the global economy. More than 70% of what we consume in the U.S. is moved by trucks. Those trucks are powered by over 3.5 million drivers and generate over $700B in revenue each year. Trucking is a massive industry, but it is also a traditional one, and like many traditional industries it is ripe for innovation. CloudTrucks is building the operating system for trucking and is the first platform designed specifically to empower truck drivers. Our all-in-one, 'business in a box' solution optimizes and automates operations and accelerates cash flow for drivers, so they can focus on building their business.

The Data Analytics team at CloudTrucks owns all business reporting end to end and develops both internal and customer-facing data analytics products in collaboration with product development teams. As a Senior Data Analytics Engineer, you'll be responsible for maintaining and scaling our data infrastructure. You'll also collaborate with teams across the company and support their data needs, from ingesting new data sources to helping them design the right data architecture for more intricate features and reporting.

Responsibilities

  • Build, audit, and evolve data ingestion processes, always with performance and scalability in mind - we use a mix of Google Cloud services, Airflow, and Segment (see the illustrative sketch after this list)
  • Evolve and scale our data warehouse
  • Add new data sources, and maintain and organize our data warehouse
  • Apply engineering best practices to our data transformation layer - we use Dataform on Google Cloud
  • Improve the efficiency of our most demanding transformation queries with performant SQL code
  • Enable operational analytics by syncing data to 3rd party tools, 'closing the loop' in data circulation
  • Be the keystone for self-service analytics and data visualization
  • Manage data visualization in Looker; build and own mission critical dashboards
  • Support the organization in answering questions with data through training, tooling, process, and your ingenuity
  • Collaborate across the company to ensure the right data is available for all projects
  • Define, drive, and own service level agreements for both customer-facing and internal data analytics products
  • Champion data best practices across engineering, especially around efficiency, coding standards, data observability, data security and operations
  • Own the Data Infrastructure roadmap, and work with the Head of Data Analytics to define the strategy for the data warehouse and data infrastructure
  • Collaborate with software engineers on data needs for Machine Learning pipelines
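
For illustration only, here is a minimal sketch of the kind of ingestion pipeline described above, assuming Airflow 2.x with the Google provider package; the bucket, dataset, and table names are hypothetical, not CloudTrucks' actual resources.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    # Hypothetical resource names, for illustration only.
    with DAG(
        dag_id="segment_events_to_bigquery",
        schedule_interval="@hourly",      # Airflow 2.x scheduling argument
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Append Segment event exports landed in GCS into a BigQuery staging
        # table on each run, so the pipeline scales with data volume.
        load_segment_events = GCSToBigQueryOperator(
            task_id="load_segment_events",
            bucket="example-segment-exports",
            source_objects=["events/{{ ds }}/*.json"],
            destination_project_dataset_table="example_project.staging.segment_events",
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_APPEND",
            autodetect=True,
        )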

Requirements

  • 5+ years of experience working with data warehouses: building, monitoring, maintaining and scaling ETL pipelines, with a focus on data quality, integrity and security
  • Expertise in software engineering principles - version control, code reviews, testing, CI - as well as git and command line interfaces
  • Expertise in writing complex, efficient, and DRY SQL code, handling large data sets (preferably in Python), and identifying and resolving bottlenecks in production systems (a brief illustrative sketch follows this list)
  • Understanding of data engineering architectures, tools and resources - databases, computation engines, stream processors, workflow orchestrators and serialization formats - especially cloud hosted and managed versions
  • An efficient, customer-focused approach to development, pursuing pragmatic solutions to deliver the best results
  • Expertise in managing analytics, data engineering, and visualization tools; Looker preferred
  • Strong experience with GCP (BigQuery, Dataform) as well as with Airflow or other orchestration tooling
  • Strong analytical skills and the ability to work with both structured and unstructured datasets; able to perform data extraction, cleaning, and analysis, and to present insights to both technical and non-technical stakeholders
  • Demonstrated ability to translate business requirements into technical solutions and actionable insights, while leveraging project management tools to successfully organize work and deliver results
  • Strong written and verbal communication skills are paramount for this role
  • Comfortable working in the dynamic, collaborative environment of a fast-growth startup
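
As a purely illustrative sketch of the efficient-SQL point above, the snippet below assumes BigQuery's Python client and a hypothetical date-partitioned table; filtering on the partition column and dry-running the query first are two common ways to keep scan costs in check and spot bottlenecks.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical table and columns; restricting the partition column keeps
    # bytes scanned proportional to the date range rather than the full table.
    query = """
        SELECT driver_id, SUM(revenue) AS total_revenue
        FROM `example_project.analytics.loads`
        WHERE load_date BETWEEN @start_date AND @end_date
        GROUP BY driver_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end_date", "DATE", "2024-01-31"),
        ],
        dry_run=True,            # estimate bytes scanned before running for real
        use_query_cache=False,
    )
    dry_run_job = client.query(query, job_config=job_config)
    print(f"Query would scan {dry_run_job.total_bytes_processed:,} bytes")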

Nice-to-haves

  • Experience with Python or R
  • Experience with Salesforce architecture
  • Experience working in Freight Operations or Logistics
  • Experience working at high-growth startups. Preference for experience in consumer tech, marketplace, or SaaS industries

Job Keywords

Hard Skills
  • BigQuery
  • Python
  • R
  • Segment
  • SQL