Murtech Consulting - Pittsburgh, PA

posted 2 months ago

Full-time - Mid Level
Remote - Pittsburgh, PA
Professional, Scientific, and Technical Services

About the position

The Developer - Databricks and Google Cloud Platform position is a remote role that involves collaborating with architects, developers, analysts, and non-technical end users to design and engineer solutions for operational data. The successful candidate will work closely with team members to understand the business requirements that inform solution analyses and designs, and will help build those solutions, including designing frameworks that load and transform raw source data into enhanced data assets.

The developer will code data ingestion, transformation, and delivery programs and logic, ensuring that solutions meet the specified requirements, and will develop code that encodes business logic as part of the ETL (Extract, Transform, Load) process. This includes unit testing the developed code, deploying code to testing environments, and assisting with end-to-end testing to confirm that solutions function as intended.

Familiarity with Google Cloud Platform architecture and services, as well as with Databricks, is essential for success in this position. The developer will also optimize and troubleshoot Databricks workflows to ensure efficient, effective data processing.

The ideal candidate has a bachelor's degree in information technology, computer science, or a related field, along with 3-5 years of development experience, specifically with Databricks, including Databricks notebooks, clusters, and jobs. Experience in the healthcare or health insurance industry is preferred, as is familiarity with the Highmark enterprise data platform. Excellent written and verbal communication skills are crucial, as the developer must be able to work both independently and collaboratively within a team.

Responsibilities

  • Collaborate with architects, developers, and analysts to design and engineer solutions for operational data.
  • Understand business requirements that drive solution analyses and designs.
  • Assist in building solutions and designing frameworks for data loading and transformation.
  • Code data ingestion, transformation, and delivery programs and logic.
  • Develop code to encode business logic as part of the ETL process.
  • Perform unit testing on developed code.
  • Deploy code to testing environments and assist with end-to-end testing.
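The ingestion, transformation, and unit-testing responsibilities above follow the standard ETL pattern. A minimal sketch of that pattern, in plain Python for portability (this is not the employer's actual code: the field names and the "inactive plan" filter are hypothetical, and in practice this logic would run as PySpark in a Databricks notebook or job rather than over lists of dicts):

```python
# Illustrative ETL sketch: ingest raw records, apply a business rule,
# and deliver an enhanced data asset, with a unit test at the end.

def ingest(raw_rows):
    """Parse raw CSV-like source records into dicts."""
    parsed = []
    for row in raw_rows:
        member_id, plan, amount = row.split(",")
        parsed.append({"member_id": member_id, "plan": plan,
                       "amount": float(amount)})
    return parsed

def transform(rows):
    """Encode a (hypothetical) business rule: drop inactive plans,
    then total spend per member."""
    totals = {}
    for r in rows:
        if r["plan"] != "inactive":
            totals[r["member_id"]] = totals.get(r["member_id"], 0.0) + r["amount"]
    return totals

def deliver(totals):
    """Shape the result for loading into a target table."""
    return [{"member_id": m, "total_amount": round(t, 2)}
            for m, t in sorted(totals.items())]

# Unit test for the transformation logic, mirroring the unit-testing
# duty called out in the posting.
raw = ["A1,gold,10.50", "A1,gold,4.25", "B2,inactive,99.00"]
result = deliver(transform(ingest(raw)))
assert result == [{"member_id": "A1", "total_amount": 14.75}]
```

Keeping the ingest, transform, and deliver stages as separate functions is what makes the business logic unit-testable in isolation, before the code is deployed to a testing environment for end-to-end runs.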

Requirements

  • Bachelor's degree in Information Technology, Computer Science, or a closely related field.
  • 3-5 years of experience as a developer.
  • 3-5 years of experience in Databricks technology, including working with Databricks notebooks, clusters, and jobs.
  • Expertise with Google Cloud Platform.
  • Strong problem-solving, root cause analysis, and issue resolution skills.
  • Experience in the healthcare or health insurance industry.
  • Excellent written and verbal communication skills.
  • Ability to work independently and as part of a team.
  • Experience with the Highmark enterprise data platform is preferred.

Nice-to-haves

  • Experience with application development tools.
  • Familiarity with the software development life cycle used by the Highmark enterprise.