Veritex Bank - Dallas, TX

posted 3 months ago

Full-time - Mid Level
Dallas, TX
1-10 employees

About the position

As a Database Administrator/Engineer, you will be responsible for all aspects of data processing, including database management, architecture/modeling, and ETL (Extract, Transform, Load) processing. You will create and deliver analytical solutions using a variety of tools, collaborating closely with the corporate data analytics team and business stakeholders to gather requirements and translate them into technical specifications and process documentation.

You will work on the architecture and development of an event-driven data warehouse, focusing on streaming, batch processing, data modeling, and storage solutions. You will also provide technical guidance, review code and test results, and oversee the production development process by reviewing pull requests and scripts.

Day to day, you will work with advanced databases: writing and optimizing SQL queries, stored procedures, and functions, and managing data partitioning and indexing. You will write and debug Python/PySpark scripts to generate data extracts, clean and conform data, and deliver it for consumption. Implementing ETL architecture, conducting data profiling, and establishing process flows, metric logic, and error handling will be key components of your responsibilities. You will also support continuous improvement initiatives by exploring and presenting alternatives to existing processes and technologies, keeping the data processing environment efficient and effective.
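To give candidates a concrete sense of the extract-clean-deliver work described above, here is a minimal sketch in plain Python. The record fields (`account_id`, `opened`, `balance`) and cleaning rules are hypothetical examples, not Veritex's actual schema; in practice this logic would typically run as a PySpark job.

```python
from datetime import datetime

def clean_records(raw_rows):
    """Clean and conform raw extract rows: trim identifier strings,
    normalize dates to ISO format, round balances to two decimals,
    and drop rows missing a primary key."""
    cleaned = []
    for row in raw_rows:
        if not row.get("account_id"):  # drop rows without a key
            continue
        cleaned.append({
            "account_id": row["account_id"].strip(),
            "opened": datetime.strptime(row["opened"], "%m/%d/%Y").date().isoformat(),
            "balance": round(float(row["balance"]), 2),
        })
    return cleaned

raw = [
    {"account_id": " A-100 ", "opened": "01/15/2024", "balance": "2500.509"},
    {"account_id": "", "opened": "02/01/2024", "balance": "10"},
]
print(clean_records(raw))
```

The same shape of logic (filter, conform types, standardize formats) is what the data profiling and error-handling responsibilities below refer to, just applied at warehouse scale.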

Responsibilities

  • Contribute to the design of conceptual architectures and technical solutions.
  • Design, develop, and maintain scalable and efficient data pipelines and workflows using Azure Synapse or similar tools to support various business use cases and analytical requirements.
  • Collaborate closely with cross-functional teams to understand data requirements and translate them into technical solutions.
  • Implement and manage CI/CD pipelines using Azure DevOps to automate the deployment, testing, and monitoring of data engineering solutions, ensuring continuous integration and delivery.
  • Optimize and tune data pipelines and process workflows for performance, scalability, and reliability.
  • Collaborate with data architects, engineers, and analysts to design, develop, and maintain data pipelines and ETL processes.
  • Leverage best practices and monitoring tools to identify and address bottlenecks and inefficiencies.
  • Utilize Azure Data Studio to design and optimize database schemas, stored procedures, and SQL queries.
  • Ensure best practices in data integrity, efficiency, and security.
  • Monitor and troubleshoot data pipeline health and performance issues, proactively identifying and resolving issues to minimize downtime and ensure data quality and reliability.
  • Create and maintain comprehensive documentation for data pipelines, workflows, and system configurations, ensuring knowledge transfer and compliance with organizational standards and policies.
  • Conduct data profiling, cleansing, and validation activities to ensure data quality and integrity throughout the data life cycle, adhering to data governance and compliance requirements.
  • Participate in code review, testing, and debugging activities to ensure the reliability, stability, and maintainability of data engineering solutions.
  • Provide technical guidance, mentorship, and support to team members.
  • Troubleshoot and resolve data-related issues reported by end-users or detected through monitoring and alerting systems, ensuring timely resolution and minimal disruption to operations.
  • Design and implement a mature data lifecycle, including data backup and restore procedures.
  • Assist ServiceNow admins with creating workflows relevant to the data lifecycle.
  • Ensure compliance with Role-Based Access Control (RBAC) and other regulatory requirements.
  • Develop and implement data governance policies and procedures to ensure data accuracy, completeness, and consistency.
  • Ensure data quality and integrity by implementing data validation, cleansing, and testing methods.
  • Troubleshoot and resolve data issues and provide technical support as needed.

Requirements

  • 7 or more years of experience required.
  • Bachelor's degree required.
© 2024 Teal Labs, Inc