DPR Construction - Washington, DC

posted 2 months ago

Full-time - Mid Level
Washington, DC
10,001+ employees
Construction of Buildings

About the position

DPR is seeking an experienced Data Engineer to join our Data Engineering and AI team. This role will help ensure DPR moves toward data-driven decisions grounded in modern data engineering techniques, applications, and AI. The right candidate will be excited by the prospect of optimizing, or even re-designing, our company's data architecture to support our next generation of products and data initiatives. Experience with Supply Chain, Procurement, Commercial Construction, Manufacturing, or Prefabrication is a huge plus.

In this position, you will participate in and collaborate with cross-functional workgroups and functional teams to align Data Engineering efforts and resources with business goals and objectives. You will drive strategic conversations with stakeholders to fully understand and document pain points and business requirements, define the key deliverables required to improve business processes, and develop the required integrations alongside other data engineering approaches. Building and maintaining relationships with business stakeholders will be crucial, as you will need to develop a deep understanding of their processes, tools, and goals.

Your responsibilities will include designing, building, and maintaining robust data pipelines and architectures that ensure high availability and reliability. You will develop complex data models and algorithms to extract insights from large datasets related to Supply Chain, Procurement, and Construction. You will use Snowflake for efficient data storage and warehousing and Azure Data Factory for orchestrating and automating data workflows, and you will script in Python and use DBT for transformation tasks to optimize data processes. You will implement solutions from a cloud-first perspective using Agile, Scrum, and DataOps methodologies, and assemble data sets that meet functional and non-functional business requirements.

Additionally, you will design and implement process improvements, automate manual processes, optimize data pipelines, and scale the data infrastructure while maintaining data integrity and compliance with industry standards and best practices. Staying current on emerging trends and technologies in data engineering and construction tech will also be part of your responsibilities.

Responsibilities

  • Participate in and collaborate with cross-functional workgroups and functional teams to align Data Engineering efforts and resources with business goals and objectives.
  • Drive strategic conversations with stakeholders to fully understand and document pain points and business requirements.
  • Define the key deliverables required to improve business processes and develop required integrations.
  • Develop and maintain relationships with business stakeholders and a deep understanding of their processes, tools, and goals.
  • Design, build, and maintain robust data pipelines and architectures, ensuring high availability and reliability.
  • Develop complex data models and algorithms to extract insights from large Supply Chain, Procurement, and Construction datasets.
  • Utilize Snowflake for efficient data storage and data warehousing, and Azure Data Factory for orchestrating and automating data workflows.
  • Script and program in Python and use DBT for transformation tasks to optimize data processes.
  • Assemble data sets that meet functional/non-functional business requirements.
  • Design and implement process improvements such as automating manual processes, optimizing data pipelines from source to consumption, and scaling the data infrastructure.
  • Collaborate closely with stakeholders to understand data requirements and translate these into technical solutions.
  • Maintain data integrity and compliance, adhering to industry standards and best practices.
  • Stay on top of emerging trends and technologies in data engineering and construction tech.
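As a hedged illustration of the transformation work described above, the sketch below cleans raw procurement records before they are loaded into a warehouse staging table. The schema (PO number, vendor, amount, order date) and field names are hypothetical, not DPR's actual data model; in practice this logic would typically live in a DBT model or an Azure Data Factory pipeline step.

```python
import json
from datetime import datetime

def transform_procurement_records(raw_json: str) -> list[dict]:
    """Normalize raw procurement records for warehouse loading.

    Hypothetical schema: each record carries 'po_number', 'vendor',
    'amount' (a dollar string), and 'order_date' (MM/DD/YYYY).
    Records missing a PO number are dropped (a functional requirement),
    amounts are parsed to floats, and dates are normalized to ISO 8601.
    """
    cleaned = []
    for rec in json.loads(raw_json):
        if not rec.get("po_number", "").strip():
            continue  # enforce the requirement: every row needs a PO number
        cleaned.append({
            "po_number": rec["po_number"].strip().upper(),
            "vendor": rec.get("vendor", "UNKNOWN").strip(),
            "amount": round(float(rec["amount"].replace("$", "").replace(",", "")), 2),
            "order_date": datetime.strptime(rec["order_date"], "%m/%d/%Y").date().isoformat(),
        })
    return cleaned

raw = json.dumps([
    {"po_number": " po-1001 ", "vendor": "Acme Steel ", "amount": "$12,500.00", "order_date": "03/15/2024"},
    {"po_number": "", "vendor": "NoPO Inc", "amount": "$1.00", "order_date": "01/01/2024"},
])
rows = transform_procurement_records(raw)
print(rows)
```

The same pattern scales up: validation rules capture the functional requirements agreed with stakeholders, and malformed rows are filtered or quarantined rather than silently loaded.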

Requirements

  • Bachelor's/Master's in Computer Science or a related technical field.
  • 4+ years of prior experience as a Data Engineer/Integrations Engineer in a fast-paced, technical, problem-solving environment.
  • 2+ years of experience with a public cloud (AWS/Microsoft Azure/Google Cloud).
  • 2+ years of data warehousing experience (Redshift or Snowflake).
  • 2+ years of experience with Agile engineering practices.
  • Experience with enterprise data lakes, data warehouses, data marts, and big data.
  • Expert-level proficiency in Azure Data Factory, Integration Platforms, Python programming, DBT, and data modeling.
  • Demonstrated experience with API development and management for data integration.
  • Advanced SQL knowledge, including query authoring and working familiarity with a variety of relational databases.
  • Strong analytical skills related to working with unstructured datasets.
  • Strong project management and organizational skills.
  • Experience with SQL, JSON, XML, and LookML.
  • Knowledge of APIs, REST, and GraphQL.
  • Experience with programming languages like Python.
  • Exposure to the Construction Industry is a huge plus.
  • Excellent communication skills to ask questions, clarify requirements, and engage with the team and stakeholders.
  • Strong logic, reasoning, and critical thinking skills to solve problems as they arise.
  • Must be an independent problem solver who can evaluate a situation and build solution options.
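In integration work, the JSON and XML requirements above usually mean converting between interchange formats. A minimal sketch, assuming a hypothetical vendor XML payload (the `<orders>` element and its attributes are illustrative, not a real vendor API):

```python
import json
import xml.etree.ElementTree as ET

def xml_orders_to_json(xml_payload: str) -> str:
    """Convert a hypothetical vendor <orders> XML payload to JSON.

    Each <order> element carries 'id' and 'qty' attributes; the output
    is a JSON array of objects with integer quantities, ready for a
    pipeline staging layer.
    """
    root = ET.fromstring(xml_payload)
    orders = [
        {"id": order.attrib["id"], "qty": int(order.attrib["qty"])}
        for order in root.findall("order")
    ]
    return json.dumps(orders)

payload = '<orders><order id="A1" qty="3"/><order id="A2" qty="7"/></orders>'
print(xml_orders_to_json(payload))
```

A REST or GraphQL integration would wrap logic like this around authenticated HTTP calls, but the parse-normalize-serialize shape stays the same.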

Nice-to-haves

  • Experience working with Supply Chain, Procurement, Commercial Construction, Manufacturing, and Prefabrication.