Kroger - Blue Ash, OH

posted about 2 months ago

Full-time
Blue Ash, OH
1,001-5,000 employees
Food and Beverage Retailers

About the position

The Advanced Data Engineer at Kroger is responsible for implementing data solutions in Azure, focusing on enterprise data and information architecture. This role involves analyzing, designing, and developing data deliverables that guide technological responses to business outcomes. The engineer will collaborate with various teams, including 84.51, to create reusable standards and design patterns that enhance the technical infrastructure related to data across the enterprise.

Responsibilities

  • Analyze, design, and develop enterprise data and information architecture deliverables.
  • Lead activities that create deliverables to guide technological responses to business outcomes.
  • Facilitate analysis and design tasks for the development of the enterprise's data architecture.
  • Develop reusable standards, design patterns, guidelines, and configurations for data infrastructure.
  • Collaborate directly with 84.51 to enhance data solutions.
  • Participate in the development and communication of data strategy and roadmaps.
  • Drive the development of enterprise standards for data domains and solutions.
  • Provide technical leadership to ensure clarity and alignment across ongoing projects.
  • Drive digital innovation by leveraging new technologies to transform core data assets.
  • Define high-level migration plans to address gaps between current and future states.
  • Present cost/benefit analysis opportunities to leadership for architectural decisions.
  • Lead analysis of the technology environment to detect deficiencies and recommend improvements.
  • Mentor team members in data principles, patterns, processes, and practices.
  • Promote the reuse of data assets and manage the data catalog for reference.

Requirements

  • Bachelor's Degree in computer science, software engineering, or a related field.
  • Demonstrated written and oral communication skills.
  • Knowledge of at least two of the following technical disciplines: data warehousing, data management, analytics development, data science, APIs, data integration, cloud, servers and storage, and database management.
  • Strong knowledge of industry trends and competition.
  • Basic understanding of network and data security architecture.
  • 2 years of experience with Databricks, including Unity Catalog.
  • 3 years of experience with PySpark/Spark.
  • 5 years of hands-on experience with data platforms.
  • 5 years of hands-on experience in implementation and performance tuning of MPP databases.
  • Experience with software development and automation methodologies.
  • Experience with data security best practices.
  • Experience in implementing data services in Azure including Azure SQL, Cosmos DB, Databricks, ADLS, Blob Storage, ADF, Azure Stream Analytics.

Nice-to-haves

  • Experience building solutions using elastic architectures (preferably Microsoft Azure and Google Cloud Platform).
  • Experience with data science solutions or platforms.
  • Experience with operational data science, machine learning, or AI solutions.
  • Experience with a variety of SQL, NoSQL, and Big Data Platforms.