Petadata - Las Vegas, NV

posted 2 months ago

Full-time - Mid Level
Remote - Las Vegas, NV

About the position

We are seeking a highly skilled Data Engineer with a strong emphasis on Python programming. The ideal candidate will have extensive experience with object-oriented programming (OOP) and be proficient in building, understanding, and using packages, modules, objects, and methods. This role will require you to develop and maintain APIs using FastAPI and Pydantic models, work with the SQLAlchemy ORM, and manage data workflows with tools such as Airflow, Snowflake, and Spark, covering both ETL and ELT pipelines. Strong analytical skills are also essential for this position.
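For context, the FastAPI/Pydantic work described above typically starts with request models like the following. This is a minimal, hypothetical sketch (the model and field names are invented for illustration), showing how Pydantic validates and coerces incoming data before it reaches a FastAPI route:

```python
from datetime import date
from pydantic import BaseModel


class PipelineRun(BaseModel):
    # Pydantic validates and coerces incoming data against these annotations,
    # so a FastAPI endpoint receiving this body gets typed, checked fields.
    pipeline_id: str
    run_date: date
    row_count: int = 0


# Raw input (e.g. a parsed JSON request body) is coerced to the declared types:
# the date string becomes a datetime.date, and the numeric string becomes an int.
run = PipelineRun(pipeline_id="daily_sales", run_date="2024-01-15", row_count="1000")
```

In a FastAPI application, declaring `PipelineRun` as an endpoint parameter would make the framework perform this validation automatically and return a 422 response for malformed payloads.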

Responsibilities

  • Design, develop, and maintain robust data pipelines and workflows.
  • Write clean, efficient, and reusable Python code with a strong emphasis on object-oriented principles.
  • Develop and manage APIs using FastAPI and Pydantic models.
  • Work with the SQLAlchemy ORM to interact with databases.
  • Implement data processing workflows using Apache Airflow.
  • Manage and optimize data storage and retrieval using Snowflake.
  • Utilize Apache Spark for large-scale data processing.
  • Perform ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) operations to ensure data integrity and accessibility.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Analyze data to provide actionable insights and support decision-making processes.
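The ETL responsibilities above can be sketched in plain Python. This is a minimal illustration of the extract/transform/load pattern, using the standard-library sqlite3 module as a stand-in for a warehouse like Snowflake; all function, table, and field names are hypothetical:

```python
import sqlite3


def extract(raw_rows):
    # Extract: pull raw records from a source (here, an in-memory list
    # standing in for an API response or a file).
    return [r for r in raw_rows if r is not None]


def transform(rows):
    # Transform: normalize field names and types before loading
    # (here, converting dollar amounts to integer cents).
    return [(r["id"], round(r["amount"] * 100)) for r in rows]


def load(conn, rows):
    # Load: write the transformed rows into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount_cents INTEGER)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()


conn = sqlite3.connect(":memory:")
raw = [{"id": 1, "amount": 19.99}, None, {"id": 2, "amount": 5.00}]
load(conn, transform(extract(raw)))
```

In an ELT variant, the raw rows would be loaded first and transformed inside the warehouse; in production, each step would typically run as a separate Airflow task rather than as sequential function calls.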

Requirements

  • 10+ years of IT experience with 8+ years of experience in Data Engineering.
  • 4+ years of hands-on experience with Amazon RDS Aurora PostgreSQL and advanced Python proficiency.
  • Experience working with Agile and cloud services, SQL/NoSQL databases, and Docker/Kubernetes.
  • Experience with integration development on AWS or other cloud platforms.
  • Ability to design, architect, implement, and support key datasets that provide structured and timely access to actionable business insights.
  • Experience developing ETL processes that convert data into formats consumable by data analysts and dashboards.
  • Extensive knowledge of task orchestration, including scheduled and event-triggered tasks.
  • Pipeline monitoring and troubleshooting experience.
  • Good written and verbal communication skills.

Benefits

  • Professional work environment with opportunities for growth in the information technology field.
© 2024 Teal Labs, Inc