Lorven Technologies - Irving, TX

Full-time - Senior
Professional, Scientific, and Technical Services

About the position

The Senior Data Engineer position is a long-term contract role in Irving, TX, focused on deploying and scaling applications in a containerized environment. The role requires extensive data engineering experience, including real-time and streaming technologies, NoSQL databases, and cloud platforms. The ideal candidate has a strong background in Python, microservices, and data pipeline automation, and can collaborate effectively with cross-functional teams.

Responsibilities

  • Design and build microservices using Python.
  • Develop and optimize REST APIs.
  • Deploy and scale applications in a containerized environment (Kubernetes, AKS).
  • Implement real-time and streaming technologies (Azure Event Hubs, Kafka, Spark Streaming).
  • Build automated big data pipelines.
  • Perform data analysis and exploration.
  • Work in an agile delivery environment.
  • Collaborate with cross-functional technical teams to align priorities and achieve deliverables.
  • Set coding standards, conduct code reviews, and mentor junior developers.
  • Utilize version control systems (Git) in a multi-developer environment.
  • Orchestrate data pipelines using tools like Airflow and Azure Data Factory.
  • Design data models and system architecture.

Requirements

  • Bachelor's degree in Computer Science or equivalent.
  • 12+ years of relevant experience in data engineering.
  • Strong experience in Python with 3+ years of hands-on coding.
  • 3+ years of experience in designing and building microservices.
  • 3+ years of experience with REST API design and development using Python.
  • 3+ years of experience with NoSQL databases and query tuning/optimization.
  • 3+ years of experience with real-time and streaming technologies.
  • 3+ years of experience with cloud-based platforms (Azure, GCP, AWS).
  • Experience with Snowflake and relational databases (RDBMS).
  • Experience in building automated big data pipelines.
  • Experience working in an agile delivery environment.
  • Experience with orchestrating pipelines using tools like Airflow or Azure Data Factory.
  • Strong critical thinking, communication, and problem-solving skills.

Nice-to-haves

  • Experience in Azure/GCP.
  • Solid understanding of Kafka architecture and experience building consumers for high-volume data streams.
  • Experience with Bash shell scripting, UNIX utilities, and UNIX commands.
  • Previous healthcare experience and domain knowledge.