Bentley Systems - Exton, PA

posted about 2 months ago

Full-time - Entry Level
Exton, PA
5,001-10,000 employees
Publishing Industries

About the position

We are seeking a Data Engineer to join the Data team at Bentley Systems. This role will help deliver, maintain, and evolve our new enterprise data platform, fulfilling our mission to empower our colleagues with access to trusted data while enabling self-service analytics at scale across the entire organization. The Data Engineer will translate business requirements into value, taking a product-focused approach to the creation and roll-out of innovative data products. We are looking for a data and tech enthusiast who can help inspire a data-driven culture across Bentley.

In this position, the Data Engineer will design, develop, and maintain efficient data pipelines that serve applications and stakeholders. The role involves building strong engineering solutions in a modern cloud tech stack, establishing and maintaining data quality checks and verifications to ensure the accuracy and reliability of data, and enhancing data processing and optimizing workflows. The Data Engineer will also analyze and tune database queries for performance and understand the core database functions that support the data models.

Responsibilities extend to designing and implementing data warehouses and data marts: working with end users on requirements, understanding the data and constraints in the source systems, creating data models using facts and dimensions, and conducting end-to-end testing of the model, data mart, and pipelines. Documenting the data warehouse and data marts is also a key aspect of this role.

The Data Engineer will collaborate with the engineering team to create integrations to and from the source and target systems, and will work with the analytics team and business users to identify requirements and develop data products that meet their needs. The role includes maintaining clear, organized documentation of data-related processes, procedures, and workflows that is accessible to relevant team members, as well as implementing data security policies and best practices through auditing and role-based access control (RBAC).

Responsibilities

  • Design, develop, and maintain efficient data pipelines to serve applications and stakeholders.
  • Create excellent engineering solutions in a modern cloud tech stack.
  • Establish and maintain data quality checks and verifications to ensure the accuracy and reliability of data.
  • Enhance data processing and optimize workflows.
  • Analyze and tune database queries for performance and understand core database functions that support the data models.
  • Design and implement data warehouses and data marts, including working with end users on requirements and understanding the data and constraints that exist in the source systems.
  • Create data models utilizing facts and dimensions.
  • Conduct end-to-end testing of the model, data mart, and pipelines.
  • Create documentation for the data warehouse and data marts.
  • Collaborate with the engineering team to create integrations to and from the source and target systems.
  • Work with the analytics team and business users to identify requirements and develop data products that meet their needs.
  • Maintain clear and organized documentation of data-related processes, procedures, and workflows.
  • Implement data security policies and best practices through auditing and role-based access control (RBAC).

Requirements

  • Experience with data warehouse technologies and relevant data modeling best practices, preferably Snowflake and Databricks.
  • Experience building data pipelines/ETL and familiarity with ETL design principles.
  • Excellent SQL and data manipulation skills using common frameworks.
  • Experience with BI technologies, specifically Power BI and Qlik.
  • Experience with cloud services, specifically Azure.
  • Proficiency in a major programming language (e.g., Python, Scala, or Go).
  • Excellent problem-solving and critical-thinking skills.
  • Experience with business requirements gathering for data sourcing.
  • Excellent analytical skills and the ability to communicate with both technical and non-technical stakeholders.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field, or equivalent industry experience.

Benefits

  • A great team and culture.
  • An exciting career as an integral part of a world-leading software company providing solutions for architecture, engineering, and construction.
  • Competitive salary and benefits.
  • The opportunity to work within a global and diverse team.
  • A supportive and collaborative environment.
  • Colleague Recognition Awards.