National Grid - Hicksville, NY
posted 2 months ago
As a Staff Data Engineer at National Grid, you will play a pivotal role in our Product Engineering department, focusing on making critical data accessible to our business teams. This position is available in Waltham, MA; Brooklyn, NY; Hicksville, NY; or Syracuse, NY, and candidates from nearby states such as Connecticut, New Jersey, New Hampshire, Pennsylvania, Rhode Island, Vermont, or Maine are also welcome to apply. Your work will be integral to the US IT Electric business unit, where innovation and adaptability are key.

In this role, you will be part of a Data Engineering team that uses the agile framework to build end-to-end data pipelines. You will adhere to rigorous engineering standards and coding practices to ensure that the data delivered is of the highest quality. Your contributions will also extend to modernizing our architecture and tools, enhancing our output, scalability, and speed. You will design and develop highly scalable and extensible data pipelines that facilitate the collection, storage, distribution, modeling, and analysis of large datasets from various channels.

Key responsibilities include:

- Leading the Data Engineering team in developing, testing, documenting, and supporting scalable data pipelines.
- Building new data integrations, including APIs, to accommodate the increasing volume and complexity of data.
- Implementing scalable solutions that align with our data governance standards and architectural roadmaps for data integrations, storage, reporting, and analytics.
- Collaborating with analytics and business teams to improve the data models that feed our business intelligence tools, fostering data-driven decision-making across the organization.
- Designing and developing data integrations and a data quality framework, writing unit, integration, and functional tests, and documenting your work.
- Automating the deployment of our distributed system for collecting and processing streaming events from multiple sources.
- Performing data analysis to troubleshoot data-related issues.
- Guiding junior engineers on coding best practices and optimization.