Pivotal Solutions - Fargo, ND

posted 7 days ago

Full-time
Fargo, ND
Professional, Scientific, and Technical Services

About the position

The DevOps/Data Engineer at Pivotal Solutions, Inc. is responsible for driving infrastructure strategy and building, managing, and automating AWS infrastructure. This role involves overseeing the maintenance and growth of the data lakehouse infrastructure, championing security practices, and providing leadership on infrastructure and big data topics. The engineer will also participate in engineering support rotations and identify and adopt technologies, processes, and services that impact the organization on a system-wide scope.

Responsibilities

  • Drive the infrastructure strategy alongside the rest of the DevOps team
  • Build, manage, and automate AWS infrastructure
  • Oversee the maintenance and growth of the data lakehouse infrastructure and implementation
  • Champion security by adopting necessary tools and processes
  • Provide leadership to the entire engineering team on various infrastructure and big data topics
  • Identify and adopt high-level technologies, processes, and services that impact the organization on a system-wide scope
  • Participate in the L2 rotation for engineering support

Requirements

  • 8+ years of experience as a DevOps Engineer
  • 5+ years of experience with the AWS cloud platform, including proficiency with the AWS SDK, AWS CLI/UI, and services such as VPC, Route 53, private and public subnets, route tables, IGW, EC2 instances, CloudFront, API Gateway, IAM, ELB, Auto Scaling, CloudWatch, EFS, NFS, EBS, S3, RDS, Lambda, SQS/SNS, Kafka, security groups, etc.
  • 5+ years of experience with Terraform/Terragrunt
  • 5+ years of experience with event processing infrastructure
  • 2+ years of experience as a Data Engineer with Databricks, including data pipelines, Debezium, Spark, and data modeling
  • Expert proficiency with Python
  • Model serving experience
  • Expert proficiency in deploying, maintaining, and scaling applications
  • Proficiency with Docker, Kubernetes, and Serverless functions
  • Expert proficiency in setting up CI/CD pipelines (tools/processes/governance)
  • Experience creating and maintaining fully automated CI/CD pipelines for code deployment using tools such as Git, CircleCI, AWS CodeBuild, CodeDeploy, and CodePipeline
  • Proficient with both Relational and Non-Relational databases
  • Experience with security vulnerability scanning tools (GuardDuty, Inspector)
  • Experience building monitoring dashboards and leveraging tools such as CloudWatch, Datadog, and Sumo Logic