McKesson - Columbus, OH

posted 3 months ago

Full-time - Mid Level
1-10 employees
Merchant Wholesalers, Nondurable Goods

About the position

McKesson is a Fortune 10 company that plays a crucial role in the healthcare sector, focusing on making quality care more accessible and affordable. The company is dedicated to the health, happiness, and well-being of its employees and the communities it serves.

The Software Engineer (ETL / Databricks) position is part of the Business Operations Support data engineering team, which is responsible for planning, designing, troubleshooting, and documenting technical requirements for data flows between various operational systems and the data warehouse. This role is essential in developing and maintaining data pipelines that ingest and automate files provided by customers, using ETL tools and Databricks. The ideal candidate will have a strong background in data engineering, particularly with Databricks and ETL tools such as Talend, Informatica, SSIS, or DataStage, along with experience in SQL and general-purpose programming languages like Python or Java. The role involves end-to-end development of ETL processes, data analytics, and collaboration with other teams to understand business needs and propose innovative solutions.

The position is hybrid and requires the candidate to reside in the Columbus, OH area; the majority of work is remote, with occasional in-office presence. McKesson is committed to fostering a culture of growth and empowerment, encouraging employees to bring new ideas and make a significant impact in the healthcare industry.

Responsibilities

  • Build data pipelines to ingest and automate files provided by customers using ETL tools.
  • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using primarily ETL tools.
  • Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency.
  • Investigate and resolve data-related issues and provide support and troubleshooting expertise.
  • Collaborate across teams to understand business needs and propose innovative solutions.

Requirements

  • Bachelor's degree in Computer Science or related technical degree, or equivalent experience, and 2+ years of relevant experience.
  • 2+ years of experience with Databricks and building data pipelines to ingest and automate files provided by customers.
  • 2+ years of experience writing SQL queries using a variety of SQL commands.
  • Experience with Python.
  • Knowledge of Structured and Unstructured data.
  • Understanding of BI concepts and familiarity with relational or multi-dimensional modeling concepts.
  • Understanding of RDBMS best practices and performance tuning techniques.
  • Experience with AWS services such as S3, CloudWatch, and EC2, and a passion for working in a cloud data warehouse environment.

Nice-to-haves

  • Experience with Agile and Scrum methodologies.
  • Knowledge of Java or JavaScript.

Benefits

  • Competitive compensation package including base pay and potential bonuses.
  • Opportunities for long-term incentives based on performance and skills.