We are looking for self-motivated, responsive individuals who are passionate about data. In this role, you will build data solutions that answer complex business questions, carrying data through its full lifecycle: from processing pipelines and data infrastructure to finished data products. You will design and build ETL jobs to support the Enterprise Data Warehouse, ensuring that data is processed efficiently and effectively to meet business needs.

Your core responsibilities will include writing Extract-Transform-Load (ETL) jobs using standard tools and partnering with business teams to understand their requirements. You will assess how those requirements affect existing systems, and design and implement new data provisioning pipelines for the Finance and External Reporting domains. You will also monitor and troubleshoot operational and data issues in the pipelines, and drive architectural plans and implementations for future data storage, reporting, and analytics solutions.

This position requires a strong background in data engineering, with a focus on big data processing technologies. You will draw on your experience to optimize SQL queries in a business environment that deals with large-scale, complex datasets. Detailed knowledge of database and data warehouse concepts, together with hands-on experience with cloud technologies, will be essential for success in this role.