
ZipRecruiter - Columbus, OH

Posted 20 days ago

Full-time - Senior

About the position

The Sr. Data Platform Architect will play a crucial role in evaluating and enhancing the company's big data platform and adjacent application stacks. This position involves strategic analysis, design, and communication of both high-level visions and technical directions to ensure the platform's future aligns with business opportunities and technological advancements. The role emphasizes collaboration, problem-solving, and leadership in the Hadoop ecosystem and related technologies.

Responsibilities

  • Evaluate the existing Hadoop and adjacent ecosystem, including its architecture, infrastructure, and performance, to identify challenges in migration and areas of improvement.
  • Develop comprehensive strategies for migration or risk mitigation, including a detailed roadmap, timeline, and resource requirements.
  • Identify technology alternatives where appropriate.
  • Provide leadership to the organization in specific areas of expertise.
  • Assume the technical lead and act as the architect for major company systems or concepts.
  • Participate in major architectural reviews and plans.
  • Ensure that information and trends within the architect's area of expertise are effectively communicated to relevant business units within the company.
  • Prototype systems in anticipation of new requirements.
  • Develop functional requirements from prototype systems.
  • Analyze and solve problems in existing systems.
  • Design, code, and test multiple modules of a system in a timely manner.
  • Ensure that project teams plan and participate in load, capacity, and performance analysis and/or testing.
  • Represent the company's position by participating on or leading relevant standards committees such as ISO, NISO, ACM, and IEEE.
  • Lead the implementation of standards within company systems.
  • Perform other tasks as assigned.

Requirements

  • Master's degree plus 8 to 12 years of experience demonstrating a high level of technical knowledge.
  • Familiarity with alternative big data technologies, such as Apache Kafka, Apache Flink, Apache Cassandra, or cloud-based solutions like Amazon EMR or Google BigQuery.
  • 5+ years of leadership experience with Hadoop and adjacent technologies, such as Hive, Pig, and Spark.
  • Proficiency in programming commonly used in big data processing, such as Java, Scala, or Python.
  • Experience transitioning Hadoop workflows to alternative solutions strongly preferred.
  • Strong understanding of distributed computing principles, experience with large-scale data processing frameworks, and big data concepts in general.
  • Excellent communication and presentation skills.
  • Ability to work independently and as part of a team.
  • Several years of proven experience leading large-scale projects across multiple teams.
  • Experience with multiple complex, business-critical systems is expected.
  • Experience with on-premises big data architectures.
  • Experience with graph data architectures and graph databases.

Nice-to-haves

  • Experience with cloud-based big data solutions.
  • Knowledge of data governance and data quality frameworks.

Benefits

  • Work/life balance initiatives
  • Opportunities for professional development
  • Impactful work that contributes to global accessibility of information