Principal Data Engineer

$156,000 - $208,000/Yr

WEX - Chicago, IL

posted 7 days ago

Full-time - Senior
Credit Intermediation and Related Activities

About the position

We are looking for a highly motivated, high-potential Senior Staff Engineer to join our Data team, make a big business impact, and grow your career. This is an exciting time to be part of the Data team at WEX. WEX offers sophisticated business solutions that empower diverse customers, and the data generated from these systems, applications, and platforms is rich and complex. As one of WEX's most valuable assets, this data holds immense potential to drive value for our customers and the business. The Data team's mission is to build big data technologies, platforms, systems, and tools that clean, process, enrich, and optimize core company data, making it easy and efficient to use so that both our customers and internal teams can unlock business value. We also create value-added data products for WEX customers. Leveraging modern big data and AI technologies, we employ agile development practices, a combined engineering approach, and the product operating model to drive innovation and efficiency. We offer challenging problems with significant business impact, giving you opportunities to learn and grow. Our team consists of highly skilled engineers and leaders who will support, guide, and coach you throughout your journey.

Responsibilities

  • Collaborate with partners and stakeholders to understand customers' business challenges and key requirements.
  • Design, test, code, and instrument complex data products, systems, platforms, and pipelines.
  • Utilize data to drive decisions by effectively measuring and analyzing outcomes.
  • Develop and maintain CI/CD automation using tools like GitHub Actions.
  • Implement Infrastructure as Code (IaC) using tools like Terraform.
  • Apply software development methodologies such as TDD, BDD, and Microservice/Event-Oriented Architectures.
  • Support live data products, systems, and platforms by promoting proactive monitoring.
  • Analyze data, systems, and processes independently to identify bottlenecks and opportunities for improvement.
  • Mentor peers and foster continuous learning of new technologies within the team.
  • Attract top industry talent; contribute to interviews and provide timely, high-quality feedback.
  • Serve as a role model by adhering to team processes and best practices.
  • Collaborate with or lead peers in completing complex tasks.
  • Lead a Scrum team with hands-on involvement.
  • Own large, complex systems, platforms, and products.
  • Lead and actively participate in technical discussions.
  • Design and build high-performance, reliable systems.
  • Complete large, complex tasks independently.
  • Proactively identify and communicate project dependencies.
  • Review peer work, providing constructive feedback.
  • Build scalable, secure, and high-quality big data platforms and tools.
  • Design and build efficient systems, platforms, pipelines, and tools for the entire data lifecycle.
  • Develop data quality measurement and monitoring techniques.
  • Use data modeling techniques to design and implement efficient data models.
  • Become a deep subject matter expert in your functional area.
  • Apply creative problem-solving techniques to assess unique circumstances.
  • Leverage data and AI technologies to enhance productivity.
  • Lead team initiatives by applying your extensive experience and technical expertise.
  • Hold yourself and your team accountable for delivering high-quality results.
  • Provide strategic advice to senior leadership on highly complex situations.
  • Offer thought leadership on business initiatives.

Requirements

  • Bachelor's degree in Computer Science, Software Engineering, or a related field, OR demonstrable equivalent deep understanding, experience, and capability.
  • 10+ years of experience in large-scale software engineering.
  • Strong problem-solving skills, with excellent communication and collaboration abilities.
  • Highly self-motivated and eager to learn.
  • Extensive experience in architecture design.
  • Deep expertise in CI/CD automation.
  • Substantial experience with combined engineering practices and Agile development.
  • Extensive experience and strong implementation skills in programming languages such as Java, C#, Golang, and Python.
  • Expertise in data processing techniques, including data pipeline/platform development, SQL, and database management.
  • Extensive experience in data ingestion, cleaning, processing, enrichment, storage, and serving.
  • Experience with cloud technologies, including AWS and Azure.
  • Strong understanding of data warehousing and dimensional modeling techniques.
  • Understanding of data governance principles.

Nice-to-haves

  • Proven expertise in designing and implementing scalable, reliable, and cost-effective data architectures.
  • Extensive experience building and optimizing high-throughput data ingestion frameworks.
  • Hands-on experience with AWS, Azure, or GCP managed services for data storage, compute, and orchestration.
  • Expertise in efficient data modeling and schema design for analytical and transactional data.
  • Deep knowledge of event-driven and streaming architectures.

Benefits

  • Health, dental, and vision insurance
  • Retirement savings plan
  • Paid time off
  • Health savings account
  • Flexible spending accounts
  • Life insurance
  • Disability insurance
  • Tuition reimbursement