Cynet Systems - Richmond, VA

posted 2 months ago

Full-time - Senior
Remote - Richmond, VA
Professional, Scientific, and Technical Services

About the position

We are seeking a Senior Databricks Cloud Data Engineer to join our team remotely. This role is pivotal in understanding the technology vision and strategic direction behind our business needs. The successful candidate will have a comprehensive understanding of our current data model and infrastructure, proactively identifying gaps and areas for improvement, and will prescribe architectural recommendations with a focus on performance and accessibility so that our analytics systems remain robust and efficient.

In this position, you will partner across engineering teams to design, build, and support the next generation of our analytics systems. Collaborating with business and analytics teams, you will gather requirements for data systems that support both the development and deployment of data workloads, ranging from Tableau reports to ad hoc analyses. You will own and develop the architecture that translates analytical questions into effective reports that drive business action, and you will automate and optimize existing data processing workloads by recognizing patterns of data and technology usage and implementing effective solutions. A solid grasp of the intersection between analytics and engineering is essential, along with a proactive approach to ensuring that solutions demonstrate high levels of performance, privacy, security, scalability, and reliability upon deployment.

You will also provide guidance to partners on the effective use of the database management system (DBMS) platform through collaboration, documentation, and adherence to standard methodologies. You will design and build end-to-end automation to support and maintain software currency, creating automation services for builds using Terraform, Python, and OS shell scripts, and developing validation and certification processes through automation tools. You will design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products, and participate in developing solutions that incorporate cloud-native and third-party vendor products. Researching and performing proofs of concept (POCs) with emerging technologies will be essential, as will adopting industry best practices in the data space to advance our cloud data platform.

Finally, you will develop data streaming, migration, and replication solutions, demonstrating leadership, collaboration, and exceptional communication, negotiation, strategic, and influencing skills to gain consensus and produce the best solutions. Engaging with senior leadership and business leaders at the client will be crucial to sharing the business value of your work.

Responsibilities

  • Understand technology vision and strategic direction of business needs.
  • Understand current data model and infrastructure, proactively identify gaps, areas for improvement, and prescribe architectural recommendations.
  • Partner across engineering teams to design, build, and support the next generation of analytics systems.
  • Collaborate with business and analytics teams to understand specific requirements for data systems to support development and deployment of data workloads.
  • Own and develop architecture supporting the translation of analytical questions into effective reports that drive business action.
  • Automate and optimize existing data processing workloads by recognizing patterns of data and technology usage and implementing solutions.
  • Provide guidance to partners on effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
  • Design and build end-to-end automation to support and maintain software currency.
  • Create automation services for builds using Terraform, Python, and OS shell scripts.
  • Develop validation and certification processes through automation tools.
  • Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products.
  • Participate in developing solutions by incorporating cloud-native and third-party vendor products.
  • Participate in research and perform POCs (proofs of concept) with emerging technologies and adopt industry best practices in the data space for advancing the cloud data platform.
  • Develop data streaming, migration and replication solutions.
  • Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic and influencing skills to gain consensus and produce the best solutions.
  • Engage with senior leadership and business leaders at the client to share the business value of the work.

Requirements

  • Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or related field or equivalent work experience; advanced degree preferred.
  • Seven+ years of experience in designing and building large-scale solutions in an enterprise setting.
  • Three or more years of experience designing and building solutions in the cloud.
  • Expertise in building and managing Cloud databases such as AWS RDS, DynamoDB, DocumentDB or analogous architectures.
  • Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures.
  • Expertise in Cloud Data Warehouses in Redshift, BigQuery or analogous architectures a plus.
  • Deep SQL expertise, data modeling, and experience with data governance in relational databases.
  • Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies.
  • Refined skills using one or more scripting languages (e.g., Python, bash, etc.).
  • Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus.
  • Embrace data platform thinking; design and develop data pipelines with security, scale, uptime, and reliability in mind.
  • Expertise in relational and dimensional data modeling.
  • UNIX admin and general server administration experience required.
  • Experience with Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation technologies a plus.
  • Experience using Spark, Kafka, Hadoop, or similar distributed data technologies a plus.
  • Experience with leveraging CI/CD pipelines.
  • Experience with Agile methodologies and the ability to work in an Agile manner preferred.
  • One or more cloud certifications.

Nice-to-haves

  • Experience with Talend or Informatica for ETL/ELT processes.
  • Familiarity with Agile methodologies.

Benefits

  • Remote work flexibility
  • Contract position
  • Opportunity to work with cutting-edge technologies
  • Collaboration with senior leadership and business leaders