Church & Dwight Co. • Posted 1 day ago
Hybrid • Ewing, NJ
Chemical Manufacturing

About the position

The Data Lakehouse Architect is a pivotal role responsible for architecting, optimizing, and governing data structures within the enterprise Lakehouse platform, ensuring that the platform effectively supports enterprise-wide data products, including dashboards, AI/ML models, and web applications. The Architect develops comprehensive architecture strategies, frameworks, and detailed standards, and identifies architecture issues such as compatibility, fit, and integration.

The Architect leads architecture efforts and decisions of significant complexity and scope, spanning technology solutions and organizational functions. This individual brings deep and broad subject matter expertise in the architecture area of specialization and answers questions from the business regarding the architecture that underlies their business solutions.

The successful candidate will collaborate with leadership, architects, engineering teams, functional stakeholders, and the emerging data governance capability to build and manage scalable, efficient data architectures, ensuring that technical implementations align with enterprise standards and business goals while driving both innovation and operational excellence in data management.

Responsibilities

  • Work with the Chief Data and Analytics Officer and Senior Director of Delivery and Governance to develop and maintain a vision and strategy for a future-proof data and analytics environment that is cost-efficient and provides a platform for CHD to remain a leader in the CPG industry.
  • Act as a strategic partner, working globally and cross-functionally to design, develop, and optimize data architectures that enable impactful business insights and data-driven decisions.
  • Partner with architects, data engineers, and functional specialists to integrate disparate data sources into a unified Lakehouse architecture.
  • Leverage cutting-edge technologies, such as Delta Lake and Databricks, to deliver scalable and reliable data pipelines.
  • Design and maintain scalable, performant data architectures within the enterprise Lakehouse to support reporting, analytics, and AI/ML workflows.
  • Define and monitor standards for data modeling and governance to ensure data reliability, consistency, and compliance.
  • Implement monitoring mechanisms to ensure adherence to data governance policies.
  • Act as the subject matter expert on Lakehouse architecture, providing technical leadership and guidance to development teams.
  • Develop and maintain metadata management, data lineage, and data catalog practices using tools like Unity Catalog or equivalents.
  • Explore and implement innovative data partitioning strategies or alternative Lakehouse query engines to push technology boundaries collaboratively.
  • Participate in the development of architecture governance processes.
  • Mentor junior architects and serve as a leader in the Architecture Community of Practice.
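The partitioning and pipeline work described above can be sketched in miniature. The following pure-Python example is illustrative only (the table layout, column names, and figures are hypothetical, not from any Church & Dwight system): it mimics the directory-per-partition layout of a Delta Lake table and shows how date-based partition pruning lets a query scan only the partitions it needs.

```python
from collections import defaultdict
from datetime import date

def partition_by_date(rows):
    """Group rows into partitions keyed by their 'order_date' column,
    analogous to a table partitioned by date in a Lakehouse."""
    partitions = defaultdict(list)
    for row in rows:
        partitions[row["order_date"]].append(row)
    return dict(partitions)

def query_revenue(partitions, start, end):
    """Sum revenue while scanning only partitions inside the date range
    (partition pruning) instead of every row in the table."""
    total = 0.0
    for day, rows in partitions.items():
        if start <= day <= end:  # skip partitions outside the range
            total += sum(r["revenue"] for r in rows)
    return total

# Toy data: three orders across two months.
rows = [
    {"order_date": date(2024, 1, 1), "revenue": 100.0},
    {"order_date": date(2024, 1, 2), "revenue": 250.0},
    {"order_date": date(2024, 2, 1), "revenue": 75.0},
]
parts = partition_by_date(rows)
jan_total = query_revenue(parts, date(2024, 1, 1), date(2024, 1, 31))  # 350.0
```

In a real Databricks/Delta Lake deployment the same effect comes from the engine pruning partition directories and file-level statistics; the sketch only conveys the principle of organizing data so queries touch less of it.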

Requirements

  • Bachelor's degree in Computer Science, Mathematics, Physics, Engineering, or equivalent.
  • Ten plus (10+) years of experience in data architecture, data engineering, or a related field, with at least five (5+) years focused on modern data platforms (Lakehouses, data lakes, data warehouses), with a preference for Microsoft Azure experience.
  • Three (3) years of Consumer-Packaged Goods (CPG) industry experience in business analytics, sales, category management, finance, supply chain, or marketing (preferred).
  • Extensive experience working with cloud-based technologies, including distributed data, on-demand compute, use of GPU compute, data virtualization, and engineering of machine learning environments and LLMs (with a strong preference for Databricks experience).
  • At least two (2+) years leading mid- to large-size, complex data projects.
  • Expertise in Databricks, Delta Lake, Azure Data Lake, and other modern data technologies.
  • Strong proficiency in data modeling techniques (e.g., dimensional, normalized, NoSQL schemas).
  • Advanced skills in SQL and Python, with experience in cloud-native development.
  • Experience implementing data governance frameworks, including metadata management and data cataloging tools such as Collibra, Alation, or equivalent.
  • Experience as a technology mentor, facilitating the growth of colleagues.
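The dimensional-modeling proficiency listed above can be illustrated with a minimal star schema: a fact table of sales referencing a product dimension by surrogate key. All table and column names here are hypothetical, chosen only to show the fact/dimension pattern in self-contained Python rather than SQL.

```python
# Hypothetical product dimension, keyed by surrogate key.
dim_product = {
    1: {"product_name": "Baking Soda", "category": "Household"},
    2: {"product_name": "Toothpaste", "category": "Personal Care"},
}

# Hypothetical fact table: each row references the dimension by key.
fact_sales = [
    {"product_key": 1, "units": 10},
    {"product_key": 2, "units": 4},
    {"product_key": 1, "units": 6},
]

def units_by_category(facts, dim):
    """Join facts to the dimension and aggregate units by category,
    the dimensional-model equivalent of GROUP BY on a joined attribute."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0) + row["units"]
    return totals

result = units_by_category(fact_sales, dim_product)  # {"Household": 16, "Personal Care": 4}
```

The same shape in SQL would be a `JOIN` from the fact table to the dimension followed by `GROUP BY category`; the point of the dimensional design is that analytic questions reduce to exactly this join-and-aggregate pattern.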

Job Keywords

Hard Skills
  • Data Lakes
  • Microsoft Azure
  • NoSQL
  • Python
  • SQL