Python Data Developer

$84,200 - $156,600/Yr

London Stock Exchange - Washington, DC

posted 4 months ago

Part-time - Mid Level
Washington, DC
10,001+ employees
Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services

About the position

We're building the next generation of capital markets technology. Global capital markets are an ocean of fast-moving, interrelated, and complex data, and we are looking for passionate, enthusiastic problem-solvers with expertise in capital markets data to join us as we continue providing technology innovation and support to the world's leading global capital markets companies.

The Python Data Developer will be responsible for data processing pipelines. Our mission is "Capturing, Storing and Transforming" the world's market data, and our data team manages the pipelines that flow between each of those stages. The role has a large operational component, but also requires someone who can anticipate future growth in utilization and guide technology selection for the pipeline's components.

In this role you will interface regularly with our client, a prominent US government agency, and play a critical role in shaping and delivering solutions. Your responsibilities will span the entire product lifecycle, from conception to execution, as well as ongoing service and product improvements. You will collaborate closely with cross-functional teams, stakeholders, and end-users to ensure successful product outcomes.

Responsibilities

  • Develop and maintain data processing pipelines for capital markets data.
  • Interface with clients, particularly a prominent US government agency, to deliver solutions.
  • Manage the entire product lifecycle from conception to execution and ongoing improvements.
  • Collaborate with cross-functional teams and stakeholders to ensure successful product outcomes.

Requirements

  • US Citizen eligible for US Government Security Clearance
  • 5+ years of relevant professional experience
  • Significant experience with Python and ability to write and maintain object-oriented Python applications
  • Experience in a batch data processing environment (or similar), running large automated data jobs
  • Experience with ETL pipelines and Data Lakes
  • Strong knowledge of AWS, Docker, and Git, including hosted platforms (GitLab and/or GitHub)
  • Proficiency with Linux, Bash, and understanding of Linux server architecture and operation
  • Experience running profilers, debuggers, and general troubleshooting/debugging
  • Ability to analyze performance of large and small data queries, jobs, and pipelines to find and fix bottlenecks

Nice-to-haves

  • Experience with monitoring tools like Nagios, Prometheus, Grafana
  • Experience with Uptime, Jenkins, Airflow, OpenMetrics, Kubernetes, DataDog

Benefits

  • Annual Wellness Allowance
  • Paid time-off
  • Medical, Dental, Vision insurance
  • Flex Spending & Health Savings Options
  • Prescription Drug plan
  • 401(k) Savings Plan with Company match
  • Basic life insurance
  • Disability benefits
  • Emergency backup dependent care
  • Adoption assistance
  • Commuter assistance