Senior Data Engineer (#364165)

$113,000 - $123,000/Yr

University of Minnesota

posted 2 months ago

Full-time - Mid Level
Educational Services

About the position

The Senior Data Engineer position at the University of Minnesota is a critical role within the Office of Information Technology (OIT), responsible for developing and maintaining mission-critical applications. The successful candidate will work closely with the Data Platform and Cloud Enablement teams to implement a multi-container Oracle Cloud Infrastructure (OCI) environment supporting both Oracle GoldenGate and Oracle Analytics Cloud. The role is well suited to an experienced OCI engineer with a strong background in establishing OCI architecture and an interest in enhancing the University's data architecture to support new products and data initiatives.

The Senior Data Engineer will spend approximately 80% of their time on data pipeline architecture and development. Responsibilities include supporting change data capture (CDC) techniques on the PeopleSoft Oracle database to feed ERP data into downstream systems, creating Azure Data Pipelines using the Microsoft Azure toolkit, and implementing data structures on both Oracle and Microsoft Azure platforms. The engineer will also maintain data security activities, including role and account management and regular access reviews, and will design, develop, and deploy new APIs, event streams, and file-based integrations using data integration technologies such as Dell Boomi, Oracle SQL, Apache Kafka, and Amazon SQS.

The remaining 20% of the role involves project leadership: mentoring team members throughout the application development lifecycle and providing subject matter expertise on change data capture and Databricks implementation practices. This position is not eligible for H-1B or Green Card sponsorship, so candidates must already hold the necessary work authorization.
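
As a rough illustration of the change data capture flow described above, here is a minimal Python sketch that consumes JSON change records from an Apache Kafka topic. The broker address, consumer group, topic name, and record fields are all hypothetical, and the upstream publisher (e.g., GoldenGate) is assumed rather than specified by the posting.

    # Minimal sketch: consume CDC events from Kafka and route them downstream.
    # Broker, group id, topic, and field names below are hypothetical.
    import json
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "kafka.example.edu:9092",  # hypothetical broker
        "group.id": "erp-downstream-feed",              # hypothetical group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["peoplesoft.cdc.changes"])      # hypothetical topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue                  # no message within the poll window
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            change = json.loads(msg.value())
            # Hand the change record to a downstream system here.
            print(change.get("table"), change.get("op_type"))
    finally:
        consumer.close()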

Responsibilities

  • Support and leverage change data capture techniques on the PeopleSoft Oracle database to feed ERP data into downstream systems.
  • Create Azure Data Pipelines using the Microsoft Azure toolkit.
  • Model system-agnostic data structures that abstract away PeopleSoft-specific source data structures.
  • Implement data structures (SQL and NoSQL) on Oracle and Microsoft Azure data platforms.
  • Implement and maintain data security activities including role, account, and secret management as well as regular access review.
  • Design, develop, and deploy new APIs, event streams, and file-based integrations using Dell Boomi, Oracle SQL, Apache Kafka, Amazon SQS, and other data integration technologies (a minimal SQS sketch follows this list).
  • Monitor and maintain production integrations and data flows.
  • Mentor team members, as needed, across all phases of the application development lifecycle.
  • Provide subject matter expertise on change data capture and Databricks implementation practices.
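
As referenced in the integrations item above, here is a minimal Python sketch of publishing an integration event to Amazon SQS with boto3; the region, queue URL, and message shape are hypothetical.

    # Minimal sketch: publish one integration event to an Amazon SQS queue.
    # Region, queue URL, and payload shape are hypothetical.
    import json
    import boto3

    sqs = boto3.client("sqs", region_name="us-east-2")
    QUEUE_URL = "https://sqs.us-east-2.amazonaws.com/123456789012/erp-events"

    def publish_event(event_type: str, payload: dict) -> str:
        """Send one event and return the SQS-assigned message id."""
        response = sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"type": event_type, "payload": payload}),
        )
        return response["MessageId"]

    if __name__ == "__main__":
        print(publish_event("student.updated", {"emplid": "1234567"}))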

Requirements

  • BA/BS plus at least 4 years of experience, or master's degree plus 2 years of experience.
  • Experience with the Microsoft Azure platform.
  • Experience with Databricks (a minimal pipeline sketch follows this list).
  • Experience building and optimizing 'big data' pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
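
As referenced in the Databricks item above, here is a minimal PySpark sketch of one batch pipeline step: deduplicating change records to the latest version per key and writing the result as a Delta table. The storage paths and column names are hypothetical, and Delta Lake is assumed to be available (as it is on Databricks).

    # Minimal sketch: keep the latest change record per key, then write Delta.
    # Paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.getOrCreate()  # preconfigured on Databricks

    raw = spark.read.format("delta").load("/mnt/raw/peoplesoft_changes")

    # Rank each record's versions by change timestamp, newest first.
    latest_per_key = Window.partitionBy("emplid").orderBy(F.col("change_ts").desc())

    latest = (
        raw.withColumn("rn", F.row_number().over(latest_per_key))
           .filter(F.col("rn") == 1)
           .drop("rn")
    )

    latest.write.format("delta").mode("overwrite").save("/mnt/curated/students")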

Nice-to-haves

  • Familiarity with Higher Education.
  • Advanced working SQL knowledge, including query authoring and familiarity with a variety of relational databases.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools.
  • Experience with object-oriented and functional scripting languages.
  • Strong analytic skills related to working with structured datasets.

Benefits

  • Competitive wages, paid holidays, and generous time off
  • Continuous learning opportunities through professional training and degree-seeking programs supported by the Regents Tuition Benefit Program
  • Low-cost medical, dental, and pharmacy plans
  • Healthcare and dependent care flexible spending accounts
  • University HSA contributions
  • Disability and employer-paid life insurance
  • Employee wellbeing program
  • Excellent retirement plans with employer contribution
  • Public Service Loan Forgiveness (PSLF) opportunity
  • Financial counseling services
  • Employee Assistance Program with eight sessions of counseling at no cost
  • Employee Transit Pass with free or reduced rates in the Twin Cities metro area