Pyramid Technology Solutions - Los Angeles, CA

Full-time - Senior
Los Angeles, CA
Professional, Scientific, and Technical Services

About the position

The Senior Data Engineer will play a pivotal role in designing, implementing, and maintaining data solutions that support the organization's data strategy. This position requires a deep understanding of the Azure ecosystem, including Azure Data Factory, Data Lake Storage, and Blob Storage, as well as proficiency in big data technologies such as Databricks and Spark. The ideal candidate will have extensive experience in data engineering fundamentals, including ETL/ELT processes, data pipelines, and data modeling. The role also involves working with various programming languages, primarily Python and SQL, and requires a solid understanding of data warehousing and business intelligence tools such as Power BI and Tableau.

In this role, the Senior Data Engineer will be responsible for developing and optimizing data models and pipelines, ensuring data quality and governance, and implementing best practices for data security and compliance. The candidate will also be expected to leverage their knowledge of API management, particularly with tools like Apigee, to enhance data accessibility and integration across platforms. The position requires a strong focus on automation and efficiency, utilizing DataOps principles to streamline data resource management in Azure.

The Senior Data Engineer will collaborate with cross-functional teams to ensure that data solutions align with business objectives and support decision-making processes. This role is critical in driving the organization's data initiatives and requires a proactive approach to problem-solving and innovation in data engineering practices.

Responsibilities

  • Design and implement data solutions using Azure Data Factory and Databricks.
  • Develop and optimize ETL/ELT processes and data pipelines.
  • Create and maintain data models and schema designs for data warehousing.
  • Ensure data quality, governance, and compliance with security standards.
  • Collaborate with cross-functional teams to align data solutions with business objectives.
  • Automate Azure data resources using DataOps principles and tools.
  • Implement API management solutions to enhance data accessibility.
  • Monitor and audit data processes to ensure compliance and security.

Requirements

  • A minimum of seven (7) years of experience applying Enterprise Architecture principles.
  • At least five (5) years in a lead capacity within data engineering.
  • Five (5) years of hands-on experience with Azure Data Factory and Databricks.
  • Proven experience in automating Azure data resources using DataOps principles.
  • Five (5) years of experience developing data models and pipelines using Python.
  • Five (5) years of experience working with Lakehouse platforms.
  • Three (3) years of experience in CI/CD pipelines and infrastructure automation using Terraform.
  • Five (5) years of experience with data warehousing systems and integration of BI tools.

Nice-to-haves

  • Knowledge of additional programming languages such as Scala or R.
  • Experience with performance tuning of OLAP/OLTP systems.
  • Familiarity with BI tools beyond Power BI and Tableau.
  • Experience with audit and monitoring in cloud environments.