Kaseya - Atlanta, GA

posted about 2 months ago

Full-time - Mid Level
Atlanta, GA
Professional, Scientific, and Technical Services

About the position

Kaseya is seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team in Atlanta, Georgia. As a leading provider of complete IT infrastructure and security management solutions, Kaseya is committed to delivering exceptional value to our customers through innovative technology and a strong focus on data management.

The Senior Data Engineer will play a critical role in designing, developing, and maintaining our data infrastructure, ensuring that data is accessible, reliable, and valuable to our business operations. The position requires extensive experience with Snowflake, ETL tools, and offshore coordination, so candidates need a strong technical background and the ability to work collaboratively with cross-functional teams.

Day to day, you will build and maintain data pipelines and ETL processes, use Snowflake for data warehousing, and leverage Azure cloud services for secure, efficient data storage, processing, and integration. You will partner with cross-functional teams to translate business data requirements into solutions, coordinate with offshore teams to keep data engineering activities aligned, and monitor and troubleshoot pipeline issues to safeguard data integrity and availability. You will also implement data governance and best practices for data management and security, and optimize existing infrastructure for efficient processing and storage.

As a Senior Data Engineer, you will provide technical leadership and mentorship to junior data engineers and team members, staying up to date with emerging technologies and industry trends to recommend and implement improvements.

Responsibilities

  • Design, develop, and implement data pipelines and ETL processes using industry-standard tools and best practices.
  • Utilize Snowflake for data warehousing, ensuring optimal performance, scalability, and reliability.
  • Leverage Azure cloud services for data storage, processing, and integration, ensuring efficient and secure data handling.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
  • Coordinate with offshore teams to ensure seamless integration and alignment of data engineering activities.
  • Monitor and troubleshoot data pipeline issues, ensuring data integrity and availability.
  • Implement data governance and best practices for data management and security.
  • Optimize and maintain existing data infrastructure, ensuring efficient data processing and storage.
  • Provide technical leadership and mentorship to junior data engineers and team members.
  • Stay up-to-date with emerging technologies and industry trends, recommending and implementing improvements.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering or a related field.
  • Proficiency in Snowflake, including data modeling, performance tuning, and query optimization.
  • Strong experience with ETL tools (e.g., Matillion, Informatica, Talend, Apache NiFi) and data integration techniques.
  • Proven experience with Azure cloud services, including Azure Data Factory, Azure SQL Database, and Azure Blob Storage.
  • Proven experience in coordinating and collaborating with offshore teams.
  • Proficiency in SQL and other programming languages (e.g., Python, Java, Scala).
  • Solid understanding of data warehousing concepts and best practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and interpersonal skills, with the ability to work effectively in a team-oriented environment.
  • Knowledge of DevOps practices and tools (e.g., Docker, Kubernetes, CI/CD pipelines).
  • Familiarity with machine learning concepts and tools.