HRmango - Salt Lake City, UT

posted 2 months ago

Full-time - Mid Level
Salt Lake City, UT
Professional, Scientific, and Technical Services

About the position

Are you a skilled Database Administrator seeking a rewarding career that makes a meaningful impact? HRmango is seeking a DBA for the State of Utah's Department of Government Operations, Division of Technology Services (DTS), to support the Department of Corrections. This position is a one-year state contract, and only local candidates will be considered.

In this role, you will manage SQL databases across Windows and Linux servers, as well as Google Cloud Platform (GCP), with a strong emphasis on BigQuery and GCP services. You will be responsible for establishing, managing, and optimizing our data warehouse infrastructure to enhance business intelligence and analytics. Your contributions will be vital in ensuring our data systems are reliable, scalable, and high-performing, and you will work closely with data analysts, engineers, and business stakeholders to deliver actionable insights and support data-driven decisions.

Your responsibilities will include designing and implementing BigQuery data warehouses, building and maintaining data pipelines, monitoring and tuning data warehouse performance, and ensuring data security and compliance. You will also develop and enforce data governance policies, document data warehouse architecture, and provide training and support to team members and end-users. Flexibility is key, as tasks may need to be performed during and after business hours.

Responsibilities

  • Develop and manage BigQuery data warehouses, including designing table structures, writing stored procedures, and optimizing query performance.
  • Build and maintain data pipelines for ingesting, transforming, and loading data from sources such as on-premises Informix databases and AWS-hosted PostgreSQL into BigQuery on GCP (see the sketch after this list).
  • Monitor and tune data warehouse performance, including optimizing queries, debugging, and applying indexing and caching strategies.
  • Implement best practices for data security, manage access controls, and ensure compliance with relevant regulations and standards.
  • Set up monitoring and alerting for data warehouse systems, troubleshoot issues, and ensure high availability and reliability.
  • Develop and enforce data governance policies, data quality standards, and best practices.
  • Work closely with data analysts, engineers, and business stakeholders to understand their data needs and deliver effective solutions.
  • Document data warehouse architecture, processes, and best practices. Provide training and support to team members and end-users.
  • Create, manage, and maintain secure database access roles.
  • Perform other duties as assigned.
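
The pipeline bullet above names specific systems (on-premises Informix, AWS-hosted PostgreSQL, BigQuery on GCP). As a purely illustrative sketch of one such load step, the Python snippet below copies recent rows from a PostgreSQL source into a partitioned, clustered BigQuery table using the google-cloud-bigquery client with pandas and SQLAlchemy; the project, dataset, table, and connection-string values are hypothetical placeholders, not details from this posting.

    # Illustrative only: hypothetical names, not the actual state systems.
    import pandas as pd
    import sqlalchemy
    from google.cloud import bigquery

    PG_URL = "postgresql+psycopg2://etl_user:password@pg-host:5432/source_db"  # hypothetical source
    BQ_TABLE = "my-gcp-project.warehouse.case_events"                          # hypothetical destination

    def load_daily_batch() -> None:
        # Extract: pull the last day of rows from the PostgreSQL source.
        engine = sqlalchemy.create_engine(PG_URL)
        df = pd.read_sql(
            "SELECT event_id, case_id, event_ts, event_type "
            "FROM case_events WHERE event_ts >= now() - interval '1 day'",
            engine,
        )

        # Load: append into BigQuery; day partitioning on event_ts and
        # clustering on case_id let downstream queries prune the data they scan.
        job_config = bigquery.LoadJobConfig(
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
            time_partitioning=bigquery.TimePartitioning(field="event_ts"),
            clustering_fields=["case_id"],
        )
        bigquery.Client().load_table_from_dataframe(
            df, BQ_TABLE, job_config=job_config
        ).result()  # wait for the load job to finish

    if __name__ == "__main__":
        load_daily_batch()

In practice, a scheduler or integration tool such as Airflow, Dataflow, or Apache Beam (all named in the requirements) would orchestrate steps like this rather than a standalone script.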

Requirements

  • Proven experience with BigQuery on Google Cloud Platform (GCP), including designing and optimizing datasets.
  • Extensive knowledge of database design, documentation, and management of computer system databases and data warehouses.
  • Strong skills in SQL for querying, analysis, and performance tuning (see the sketch after this list).
  • Experience with data integration tools (e.g., Dataflow, Apache Beam, Airflow) and ETL processes.
  • Understanding of data security practices, access controls, and compliance requirements (e.g., GDPR, HIPAA).
  • Experience with other data warehouse technologies (e.g., Redshift, Snowflake) and cloud platforms (e.g., AWS, GCP).
  • Knowledge of data modeling, data warehousing concepts, and best practices.
  • Familiarity with data visualization tools (e.g., Looker, PowerBI).
  • Strong analytical, research, and organizational skills with a keen attention to detail.
  • Understanding and experience with cloud services such as GCP and AWS.
  • Desired skills in system administration and CI/CD pipelines.
  • Knowledge of database version control using GitHub.
  • Ability to test and troubleshoot using appropriate methodologies and techniques.
  • Excellent written and verbal communication, interpersonal, and organizational skills.
  • Ability to interact with others with sensitivity, tact, and professionalism.
  • Understanding of computer science principles, theories, and practices.
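
Several requirements above touch on SQL performance tuning and monitoring. As a hedged illustration only, the snippet below uses the google-cloud-bigquery client to list the previous day's most expensive query jobs from BigQuery's INFORMATION_SCHEMA.JOBS_BY_PROJECT view, a common starting point for tuning work; the project id and region are hypothetical placeholders.

    # Illustrative only: hypothetical project id and region.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # hypothetical project

    # JOBS_BY_PROJECT records bytes billed and slot time for each query job.
    sql = """
        SELECT user_email,
               IFNULL(total_bytes_billed, 0) / POW(10, 9) AS gb_billed,
               total_slot_ms,
               LEFT(query, 120) AS query_preview
        FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
        WHERE job_type = 'QUERY'
          AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
        ORDER BY gb_billed DESC
        LIMIT 10
    """

    for row in client.query(sql).result():
        print(f"{row.user_email}: {row.gb_billed:.2f} GB billed - {row.query_preview}")

The same view can also feed the monitoring and alerting described in the responsibilities, for example by flagging jobs whose bytes billed exceed a threshold.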

Nice-to-haves

  • Experience with Informix databases.