William Blair - Chicago, IL

posted 8 days ago

Full-time - Mid Level
Chicago, IL
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

About the position

The Lead Data Engineer will join a high-performing Business & Technology team at William Blair, working closely with Portfolio Managers, Analysts, and Quantitative Strategists. This role focuses on designing, developing, and supporting scalable systems for data management and analytics, contributing to the firm's competitive advantage in investment management. The position offers opportunities for collaboration with various internal and external partners, as well as exposure to cutting-edge technologies in a dynamic environment.

Responsibilities

  • Design, develop, and support scalable end-to-end systems for data ingestion, curation, modeling, and storage.
  • Develop APIs for proprietary web-based applications.
  • Collaborate with internal teams and external partners on new data initiatives and design data feeds/APIs.
  • Expand access to data in the data lake using tools like Python, R, and Dremio.
  • Assist in migrating the research platform from Linux VMs to containers and Kubernetes.
  • Enhance system reliability and scalability through automation, unit testing, and robust logging.
  • Develop monitoring and logging systems for production container resources using Prometheus and Grafana.
  • Evaluate and implement dbt as part of the technology stack.
  • Support Linux VMs while promoting the use of Docker and serverless cloud-based compute.
  • Wrangle alternative and unstructured data to extract insights.
  • Reduce duplication and redundancy in data infrastructure.
  • Learn and evaluate new technologies to improve data infrastructure scalability.
  • Design and execute pilots or proofs of concept for new technologies.
  • Follow best practices for data management and security protocols.
  • Provide technical guidance for high reliability and secure systems.
  • Assist in other Investment Management projects as needed.

Requirements

  • Bachelor's or Master's degree in Computer Science or a related quantitative field, or equivalent experience.
  • 5 to 10 years of programming or data engineering experience.
  • Proficiency in .NET, C#, and RESTful API development.
  • Strong SQL programming skills; Python and R are a plus.
  • Experience with Kubernetes, Docker, dbt, Presto/Trino, Spark/Databricks, Kafka, NoSQL, Prometheus, and Grafana is advantageous.
  • Experience in data modeling and normalization.
  • Exposure to Azure or AWS cloud environments.
  • DevOps experience including Git and CI/CD pipelines; Azure DevOps is a plus.
  • Systems administration skills in Linux, including shell scripting.
  • Strong problem-solving skills and persistence in troubleshooting complex systems.
  • Self-motivated with high initiative and enthusiasm.
  • Strong verbal and written communication skills.

Nice-to-haves

  • Experience with data wrangling and extraction of insights from unstructured data.
  • Familiarity with cloud-based technologies and services.

Benefits

  • Competitive salary and performance bonuses.
  • Comprehensive health insurance coverage.
  • 401(k) retirement savings plan with company matching.
  • Flexible work hours and remote work options.
  • Professional development opportunities and continued education support.