WSP Global - New York, NY

posted about 2 months ago

Full-time
New York, NY
Professional, Scientific, and Technical Services

About the position

WSP is currently initiating a search for a Data Engineer for our Bridge Inspection Group at the Penn Plaza, New York office. The selected candidate will work on projects with our Bridge Inspection Team and will be part of a growing organization that meets our clients' objectives and solves their challenges. This role is responsible for the design, development, deployment, and maintenance of data solutions. The Data Engineer will participate in a variety of data-related projects, working closely with business users and other stakeholders to gather requirements and build data pipelines that meet the organization's needs. The Data Engineer will deploy and support data platforms that process and store structural engineering, bridge design, and bridge inspection data in the required format, and will contribute to software development methods, tools, and techniques, applying agreed standards and tools to achieve well-engineered outcomes.

Responsibilities include designing, developing, and implementing data solutions that meet business requirements and data ingestion needs, enabling accurate and timely data availability for analysis and decision-making. Key tasks involve extracting, loading, and transforming (ELT/ETL) data from various sources, including on-premises and cloud-based systems, APIs, databases, and files. The Data Engineer will also develop APIs as needed to transform bridge design and inspection data from various sources into the required format. Writing well-designed, efficient code that adheres to security standards is essential, as is monitoring and troubleshooting data pipelines, in both on-premises and cloud environments, to identify and resolve issues promptly and minimize disruptions to data processing.
Additionally, the role requires implementing data quality checks and validation processes to ensure the accuracy and completeness of data; writing complex queries and scripts to efficiently manipulate, transform, and process raw bridge inspection data; and creating and executing data validation processes to ensure the reliability and consistency of incoming data. The Data Engineer will continuously optimize data pipelines for performance, scalability, and reliability; create and maintain technical documentation; and contribute to building and maintaining a data catalog and data lineage. The position also involves designing and developing CI/CD processes that ensure high availability and agility, and automating cloud data service provisioning with IAM, network, and security policies as code, along with observability. Staying current with the latest trends and best practices in data engineering, cloud computing, and Azure services, and suggesting innovative solutions to continually improve the organization's data intelligence capabilities, is crucial. Finally, the Data Engineer will monitor and report on supplier performance, customer satisfaction, adherence to security requirements, and market intelligence.

Responsibilities

  • Design, deploy, develop, and maintain data solutions.
  • Participate in data-related projects and gather requirements from business users and stakeholders.
  • Deploy and support data platforms for structural engineering, bridge design, and inspection.
  • Contribute to software development methods, tools, and techniques.
  • Design, develop, and implement data solutions to meet business requirements.
  • Extract, load, and transform (ELT/ETL) data from various sources.
  • Develop APIs to transform data from various sources of bridge design and inspection.
  • Write well-designed, efficient code adhering to security standards.
  • Monitor and troubleshoot data pipelines to resolve issues promptly.
  • Implement data quality checks and validation processes.
  • Write complex queries and scripts to manipulate and process raw data.
  • Create and execute data validation processes for incoming data reliability.
  • Continuously optimize data pipelines for performance, scalability, and reliability.
  • Create and maintain technical documentation.
  • Contribute to building and maintaining data catalog and lineage.
  • Design and develop CI/CD processes for high availability and agility.
  • Develop cloud data services provisioning automation with integrated capabilities.
  • Build tools and services for data discovery, lineage, resiliency, and privacy compliance.
  • Stay updated with trends in data engineering and cloud computing.

Requirements

  • Bachelor's Degree; Master's degree is a plus.
  • At least 3 years of relevant experience.
  • Prior experience handling and processing data for structural design, bridge design, and bridge inspection is preferred.
  • Strong knowledge of Big Data architectures, large data warehouses, and Data Lake solutions.
  • Experience designing and implementing modern data and analytics solutions, including Lakehouse and medallion architectures.
  • Proficient in cloud services including Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Store, Microsoft Purview, and Power BI.
  • Experience with major database platforms, including Oracle, SQL Server, cloud databases, and NoSQL databases.
  • Strong understanding of data engineering concepts, ELT/ETL principles, and data modeling.
  • Experience with data integration techniques for both structured and unstructured data.
  • Solid programming skills in languages such as Python, PySpark, and SQL.
  • Experience with Airflow.
  • Experience in DevOps, Git Repos, and CI/CD pipelines for code deployment.
  • Experience deploying and administering cloud-based data solutions using infrastructure-as-code and infrastructure-automation tools such as Terraform and Ansible.
  • Strong knowledge of Microsoft Azure Cloud.
  • Functional knowledge of Microsoft Power BI.
  • Experience with Jira and Confluence.
  • Proficiency in at least one software engineering methodology, such as Agile, Scrum, DevOps, Extreme Programming (XP), Kanban, Lean, or Rapid Application Development (RAD).

Nice-to-haves

  • Master's Degree
  • Experience applying structured validation and testing methods, including Unit Testing, Integration Testing, System Testing, Acceptance Testing, and Regression Testing.

Benefits

  • Medical coverage
  • Dental coverage
  • Vision coverage
  • Disability coverage
  • Life insurance
  • Retirement savings
  • Paid sick leave
  • Paid vacation or personal time
  • Paid parental leave
  • Paid time off for bereavement, voting, and naturalization proceedings