Yahoo Holdings - Champaign, IL
The Senior Software Development Engineer at Yahoo will play a crucial role in analyzing, designing, programming, debugging, and modifying software enhancements and new products. This position involves leading the development of data warehouse designs in collaboration with a team of Big Data engineers. The engineer will work in an agile, Scrum-driven environment, focusing on delivering innovative products that meet the needs of the business. Responsibilities include designing applications, writing code, developing and testing software, debugging, and documenting work and results. Staying up to date with relevant technology is essential to maintain and improve the functionality of the applications developed.

In this role, the engineer will design and implement reusable frameworks, libraries, and Java components, as well as product features, in collaboration with business and IT stakeholders. The position requires ingesting data from a variety of structured and unstructured sources into Hadoop and other distributed Big Data systems. The engineer will support the sustainment and delivery of an automated ETL pipeline: validating data extracted from sources such as HDFS, databases, and other repositories, and enriching and transforming the extracted data as required.

Additional responsibilities include monitoring and reporting on data flow through the ETL process; performing data extractions, purges, or fixes in accordance with internal procedures and policies; and tracking development and operational support via user stories and technical tasks in issue tracking software. The engineer will also troubleshoot production support issues post-deployment and mentor junior engineers within the team. This position requires a strong background in back-end programming, experience with large-scale databases, and familiarity with various database technologies and tools.
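The validate-enrich-transform stage of an ETL pipeline like the one described above can be sketched in Java. This is a minimal, illustrative example only; all class, record, and field names are hypothetical, and a real pipeline would read from HDFS or another distributed store rather than an in-memory list.

```java
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;

public class EtlStep {

    // A raw record as it might arrive from an upstream source (hypothetical shape).
    public record RawRecord(String userId, String event, long timestampMillis) {}

    // The enriched record produced by the transform stage.
    public record EnrichedRecord(String userId, String event,
                                 long timestampMillis, String dayBucket) {}

    // Validation: reject records with missing fields or nonsensical timestamps.
    static Optional<RawRecord> validate(RawRecord r) {
        if (r == null || r.userId() == null || r.userId().isBlank()) return Optional.empty();
        if (r.event() == null || r.timestampMillis() <= 0) return Optional.empty();
        return Optional.of(r);
    }

    // Enrichment: derive a day-level partition key from the event timestamp.
    static EnrichedRecord enrich(RawRecord r) {
        String day = java.time.Instant.ofEpochMilli(r.timestampMillis())
                .atZone(java.time.ZoneOffset.UTC)
                .toLocalDate()
                .toString();
        return new EnrichedRecord(r.userId(), r.event(), r.timestampMillis(), day);
    }

    // The pipeline step: validate each record, drop invalid rows, enrich the rest.
    static List<EnrichedRecord> run(List<RawRecord> input) {
        return input.stream()
                .map(EtlStep::validate)
                .flatMap(Optional::stream)
                .map(EtlStep::enrich)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<RawRecord> raw = List.of(
                new RawRecord("u1", "click", 1700000000000L),
                new RawRecord("", "click", 1700000000000L),  // invalid: blank user
                new RawRecord("u2", "view", -5L));           // invalid: bad timestamp
        List<EnrichedRecord> out = run(raw);
        System.out.println(out.size() + " valid of " + raw.size()); // 1 valid of 3
    }
}
```

In production, the same validate/enrich functions would typically run inside a distributed framework (e.g. a Spark or MapReduce job over Hadoop), with rejected records routed to a quarantine location for the monitoring and data-fix duties mentioned above.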
The ideal candidate will possess excellent analytical and problem-solving skills, particularly in the Big Data domain, and a proven understanding of Hadoop and of cloud providers such as AWS, GCP, and Azure.