Senior Data Engineer

$115,000 - $128,000/Yr

Randstad - Folsom, CA

posted about 2 months ago

Full-time - Mid Level
Administrative and Support Services

About the position

We have a direct-hire opportunity for a Senior Data Engineer in Folsom, CA. The position works a hybrid schedule, roughly 50% onsite depending on business needs.

The Senior Data Engineer enforces standards and data engineering principles and practices while designing, building, deploying, automating, and maintaining end-to-end data pipelines for new and existing data sources and targets, using modern ETL/ELT tools and practices, including stream processing technologies where appropriate. The role calls for strong problem-solving ability so the team can resolve issues in a timely and effective manner, and it drives project deliverables within the data engineering and management area to completion according to project plans. The ideal candidate brings in-depth technical expertise in data models, data analysis and design, master data management, metadata management, reference data management, data warehousing, business intelligence, and data quality improvement, along with the ability to influence internal clients to leverage standard capabilities and make data-driven decisions.

Day to day, the role works with internal technical resources to optimize the data Lakehouse through hardware or software upgrades or enhancements; designs and implements data models that balance performance, flexibility, and ease of use for both analytical and operational needs; and enables self-service analytics by designing intuitive data models and views in collaboration with the Business Intelligence team, so data is easily accessible and interpretable for business partners. The Senior Data Engineer manages and automates the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity, and conducts code reviews and approvals for data pipelines developed by the team, ensuring compliance with all data Lakehouse administration activities.

Additional duties include designing and managing the implementation of data models to meet user specifications while adhering to prescribed standards; managing and collecting business metadata and data integration points; coordinating with business analysts to prepare data designs for systems, analyze user requirements, and write technical design specifications that address user needs; and developing comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures. The role also enforces standards and procedures so data is managed consistently and properly integrated within the Lakehouse; creates and maintains thorough, up-to-date documentation for all data engineering projects, processes, and systems; implements business rules via code, stored procedures, middleware, or other technologies, ensuring scalability and maintainability of implemented solutions; analyzes processes in specialty areas to isolate and correct problems and improve workflow; and implements and maintains robust data quality assurance processes to ensure the integrity, accuracy, and reliability of data across all stages of processing and storage.
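To give a flavor of the pipeline work described above, the following is a minimal sketch of a daily extract-and-load job, assuming Apache Airflow (one of the ETL/ELT tools named under Requirements). The DAG name, task names, and extract/load logic are hypothetical placeholders, not anything specified by the role.

    # Minimal daily ETL sketch, assuming Apache Airflow 2.4+.
    # dag_id, task ids, and the extract/load bodies are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders(**context):
        # Placeholder: pull the day's records from a source system.
        print(f"extracting orders for {context['ds']}")

    def load_orders(**context):
        # Placeholder: load staged records into the Lakehouse.
        print(f"loading orders for {context['ds']}")

    with DAG(
        dag_id="orders_daily_etl",       # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # one run per day
        catchup=False,                   # skip backfill of past dates
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)
        load = PythonOperator(task_id="load", python_callable=load_orders)
        extract >> load                  # load runs only after extract succeeds

A real pipeline would replace the print statements with source queries and Lakehouse writes, and would add the automated quality checks described under Responsibilities.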

Responsibilities

  • Enforce standards, data engineering principles, and practices.
  • Design, build, deploy, automate, and maintain end-to-end data pipelines for new and existing data sources and targets utilizing modern ETL/ELT tools and practices, including stream processing technologies where appropriate.
  • Demonstrate problem-solving ability that enables the team to resolve issues in a timely and effective manner.
  • Drive and complete project deliverables within the data engineering and management area according to project plans.
  • Utilize in-depth technical expertise regarding data models, data analysis and design, master data management, metadata management, reference data management, data warehousing, business intelligence, and data quality improvement.
  • Influence internal clients to leverage standard capabilities and make data-driven decisions.
  • Work with internal technical resources to optimize the data Lakehouse through hardware or software upgrades or enhancements.
  • Design and implement data models that balance performance, flexibility, and ease of use, considering both analytical and operational needs.
  • Enable and support self-service analytics by designing intuitive data models and views, collaborating with the Business Intelligence team to ensure data is easily accessible and interpretable for business partners.
  • Work with vendors to troubleshoot and resolve system problems, providing on-call support as required.
  • Manage and automate the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity.
  • Conduct code reviews and approvals for data pipelines developed and implemented by the team.
  • Ensure compliance with all data Lakehouse administration activities.
  • Design and manage the implementation of data models to meet user specifications while adhering to prescribed standards.
  • Manage and collect business metadata and data integration points.
  • Coordinate with business analysts to prepare data designs for systems; analyze user requirements; and prepare technical design specifications to address user needs.
  • Develop and implement comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures.
  • Provide technical support and coordination during Lakehouse design, testing, and movement to production.
  • Enforce standards and procedures to ensure data is managed consistently and properly integrated within the Lakehouse.
  • Create and maintain thorough, up-to-date documentation for all data engineering projects, processes, and systems, adhering to organizational standards and leveraging modern documentation tools and practices.
  • Implement business rules via code, stored procedures, middleware, or other technologies, ensuring scalability and maintainability of implemented solutions.
  • Analyze processes in specialty areas to isolate and correct problems and improve workflow.
  • Implement and maintain robust data quality assurance processes, including automated checks and balances, to ensure the integrity, accuracy, and reliability of data across all stages of processing and storage (a minimal sketch follows this list).
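Below is the sketch referenced in the final bullet: a minimal data quality gate in Python. The column names, rules, and sample rows are hypothetical placeholders; a production version would query the Lakehouse and run as a pipeline step so failures halt downstream loads.

    # Minimal data quality gate. Columns, rules, and the sample rows
    # are hypothetical; a real check would read from the Lakehouse.

    def check_quality(rows, min_rows=1):
        """Return a list of human-readable violations; empty means pass."""
        errors = []
        if len(rows) < min_rows:
            errors.append(f"expected at least {min_rows} rows, got {len(rows)}")
        for i, row in enumerate(rows):
            if row.get("order_id") is None:
                errors.append(f"row {i}: order_id is null")  # key must be present
            amount = row.get("amount")
            if amount is not None and amount < 0:
                errors.append(f"row {i}: negative amount {amount}")
        return errors

    rows = [{"order_id": 1, "amount": 12.5}, {"order_id": 2, "amount": 3.0}]
    problems = check_quality(rows)
    if problems:
        # Fail loudly so the orchestrator stops downstream loads.
        raise ValueError("quality gate failed: " + "; ".join(problems))
    print("quality gate passed")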

Requirements

  • Bachelor's degree in a related field.
  • Advanced proficiency in Python and SQL.
  • Proficiency in Java or Scala.
  • Familiarity with R for statistical computing.
  • Strong experience with at least one major cloud platform (AWS, Azure, or Google Cloud Platform).
  • Understanding of cloud-native architectures and services.
  • Experience with modern data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
  • Familiarity with data lake architectures and technologies (e.g., Delta Lake, Apache Hudi).
  • Proficiency in designing and implementing scalable data pipelines.
  • Experience with ETL/ELT tools (e.g., Apache Airflow, AWS Glue, Databricks).
  • Strong knowledge of both relational (e.g., PostgreSQL, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases.
  • Experience with database optimization and performance tuning.
  • Proficiency in dimensional modeling and data warehouse design.
  • Experience with data modeling tools.
  • Proficiency with Git and GitHub/GitLab.
  • Experience with CI/CD pipelines for data projects (a minimal test sketch follows this list).
  • Familiarity with Docker and container orchestration (e.g., Kubernetes).
  • Understanding of data governance principles and practices.
  • Knowledge of data security and privacy best practices.
  • Familiarity with MLOps practices and tools.
  • Experience working in Agile environments (e.g., Scrum, Kanban).
  • Proficiency with project management tools (e.g., Jira, Confluence).
  • Basic proficiency with data visualization tools (e.g., Power BI, Tableau).
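As a concrete anchor for the testing and CI/CD items above, here is a minimal unit test for a pipeline transformation, assuming pytest as the test runner; the function and its column names are hypothetical and not drawn from the posting.

    # Minimal unit test for a transformation, runnable with pytest.
    # normalize_amounts and its columns are hypothetical placeholders.

    def normalize_amounts(rows):
        # Convert integer cents to dollars, dropping rows with no amount.
        return [
            {**row, "amount": row["amount"] / 100}
            for row in rows
            if row.get("amount") is not None
        ]

    def test_normalize_amounts_converts_and_filters():
        rows = [
            {"id": 1, "amount": 1250},  # becomes 12.50
            {"id": 2, "amount": None},  # dropped
        ]
        assert normalize_amounts(rows) == [{"id": 1, "amount": 12.5}]

A CI pipeline would run tests like this on every commit before a pipeline change is promoted.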