BGSF - Owings Mills, MD

posted 2 months ago

Full-time
Owings Mills, MD
Administrative and Support Services

About the position

The Log Data Engineer is a critical role within our Information Technology team, focused on managing and optimizing log data platforms, specifically Splunk and Cribl. The position supports large hybrid deployments through all lifecycle stages: requirements gathering, design, testing, implementation, operations, and documentation.

Day to day, the engineer implements and automates log data pipelines in Python for efficient ingestion into platforms such as Splunk and OpenSearch, and automates platform management tasks with tools such as Ansible. Troubleshooting issues that affect log data platforms is a key responsibility, and the engineer coordinates with platform users and develops training and documentation materials. The role also supports log data platform upgrades, including coordinating testing efforts to ensure smooth transitions. The engineer gathers and processes data from various sources using scripts, APIs, and SQL, and builds pipelines for log data engineering and testing to improve overall system performance and reliability.

This position requires a proactive approach to problem-solving and the ability to work independently with minimal guidance. The Log Data Engineer collaborates across teams to influence software design and operations, ensuring that security, performance, and disaster recovery best practices are followed throughout.
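
For illustration only, the following is a minimal sketch of the kind of Python ingestion step described above, batching events into a single request to Splunk's HTTP Event Collector (HEC). The hostname, token, index, and sourcetype are placeholders, not values from this posting.

    import json
    import requests

    # Placeholder endpoint and token -- substitute values from your own Splunk deployment.
    HEC_URL = "https://splunk.example.com:8088/services/collector/event"
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

    def send_events(events, index="main", sourcetype="app:json"):
        """Batch a list of event dicts into one HTTP Event Collector request."""
        payload = "".join(
            json.dumps({"event": e, "index": index, "sourcetype": sourcetype})
            for e in events
        )
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            data=payload,
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        print(send_events([{"msg": "pipeline smoke test", "level": "info"}]))

HEC accepts multiple concatenated JSON event objects in one request body, which keeps batched ingestion simple; a production pipeline would add retry and error handling around this call.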

Responsibilities

  • Support large hybrid Splunk and Cribl deployments through all lifecycle stages (requirements, design, testing, implementation, operations, and documentation).
  • Implement and automate log data pipelines using Python for ingestion into platforms like Splunk and OpenSearch.
  • Automate platform management with Ansible or similar tools.
  • Troubleshoot issues impacting log data platforms.
  • Coordinate with platform users and develop training/documentation materials.
  • Support log data platform upgrades, including testing coordination.
  • Gather and process data from various sources using scripts, APIs, and SQL (see the example sketch after this list).
  • Build pipelines for log data engineering and testing.
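
As an illustration of the API-driven data gathering mentioned above, the sketch below streams results from Splunk's search/jobs/export REST endpoint for a one-shot search. The host, credentials, and query are placeholders.

    import requests

    SPLUNK_API = "https://splunk.example.com:8089"   # management port; placeholder host
    AUTH = ("svc_logdata", "changeme")               # placeholder credentials

    def export_search(query, earliest="-1h"):
        """Stream newline-delimited JSON results for a one-shot Splunk search."""
        resp = requests.post(
            f"{SPLUNK_API}/services/search/jobs/export",
            auth=AUTH,
            data={
                "search": f"search {query}",
                "output_mode": "json",
                "earliest_time": earliest,
            },
            stream=True,
            timeout=60,
        )
        resp.raise_for_status()
        for line in resp.iter_lines():
            if line:
                yield line.decode("utf-8")

    for row in export_search("index=_internal | head 5"):
        print(row)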

Requirements

  • 3-5 years of experience managing and configuring Splunk Enterprise/Cloud.
  • Proficiency in Cribl.
  • Experience with Linux/Windows agents for log data engineering.
  • Cloud-based solution development using AWS.
  • Data onboarding, configuration, dashboard creation, and extraction in Splunk and Cribl.
  • Strong scripting and automation skills (bash, Python, etc.).
  • Familiarity with Splunk REST APIs, cloud platforms (AWS preferred), and container technologies.
  • Experience with data pipeline orchestration platforms (see the example sketch after this list).
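
For context on orchestration, here is a minimal sketch of an hourly pipeline defined with Apache Airflow 2.x, one common orchestration platform; the DAG id, schedule, and task body are illustrative assumptions rather than anything specified by this role.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def pull_and_forward_logs():
        """Placeholder task: pull log data from a source API and forward it downstream."""
        pass

    with DAG(
        dag_id="log_pipeline_example",
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",   # requires Airflow 2.4+; use schedule_interval on older versions
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="pull_and_forward_logs",
            python_callable=pull_and_forward_logs,
        )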

Nice-to-haves

  • Splunk Certification (Admin or Architect).
  • Ansible Tower automation.
  • GitLab experience.
  • Large platform migration experience.
  • AWS OpenSearch and Cribl expertise.
  • Familiarity with data streaming technologies (Kafka, Kinesis, Spark Streaming).