Cigniti - San Diego, CA

posted about 1 month ago

Full-time - Senior
San Diego, CA
Professional, Scientific, and Technical Services

About the position

The Architect position focuses on designing, building, and maintaining scalable cloud data architectures and automated solutions. The role involves collaborating with data engineering, DevOps, and data science teams to implement end-to-end data pipelines, ensuring data quality, security, and governance while driving the adoption of cloud best practices.

Responsibilities

  • Design, build, and maintain scalable cloud data architectures and automated solutions that support data ingestion, processing, and storage.
  • Develop automation frameworks and tools for deploying and managing cloud infrastructure and data services.
  • Collaborate with data engineers, DevOps teams, and data scientists to implement end-to-end data pipelines and workflows.
  • Ensure automation of data quality, security, and governance processes, enabling seamless integration across cloud environments.
  • Drive the implementation of cloud best practices, including monitoring, disaster recovery, security, and cost optimization.
  • Lead efforts in automating testing, deployment, and monitoring of data infrastructure.
  • Evaluate and recommend tools and technologies to improve automation processes for cloud data platforms.
  • Provide technical leadership, mentoring, and knowledge sharing within the team.

Requirements

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 7+ years of experience in data architecture, cloud infrastructure, and automation.
  • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, with deep expertise in services such as AWS Lambda, S3, RDS, EMR, Azure Data Factory, and Databricks.
  • Hands-on experience with Python, PySpark, PyTest, and Docker.
  • Strong knowledge of Kafka and MongoDB.
  • Strong knowledge of infrastructure-as-code (IaC) tools such as Terraform, CloudFormation, or Ansible.
  • Proven expertise in automating data pipelines using orchestration and CI/CD tools such as Apache Airflow and Jenkins.
  • Proficiency in scripting and programming languages (e.g., Python, Bash, Java).
  • Experience with big data technologies such as Hadoop, Spark, or Kafka.
  • Strong knowledge of DevOps methodologies and containerization tools (e.g., Docker, Kubernetes).
  • Deep understanding of data security, privacy, and compliance in cloud environments.
  • Excellent problem-solving and communication skills.