Capgemini - Atlanta, GA

posted 26 days ago

Full-time - Mid Level
Atlanta, GA
10,001+ employees
Professional, Scientific, and Technical Services

About the position

The Snowflake Developer role at Capgemini involves architecting, designing, coding, and testing data integration activities, primarily using Snowflake and other big data technologies. The position requires proficiency in scripting with Python and performance tuning of Snowflake pipelines, along with strong communication skills to present solutions effectively. The developer will work collaboratively within a team to implement fully operational solutions and optimize performance across various data technologies.

Responsibilities

  • Architect, design, code, and test data integration activities across all project phases.
  • Architect data warehouses and provide guidance to the team for implementation using Snowflake, Hadoop, or other big data technologies.
  • Develop scripts and automation using Python.
  • Performance-tune Snowflake pipelines and troubleshoot issues.
  • Demonstrate proposed solutions clearly, drawing on strong communication and presentation skills.
  • Design and implement fully operational solutions on Snowflake Data Warehouse or Hadoop.
  • Work with Python and a major relational database.
  • Communicate and present effectively, both in writing and verbally.
  • Problem-solve and convert requirements into design.
  • Optimize the performance of Spark jobs.
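For candidates gauging the "Python and a major relational database" responsibility, a minimal sketch of the kind of scripted data work involved is shown below. This uses the standard-library sqlite3 module purely as a stand-in for a production relational database; the table and column names are hypothetical, not from the posting.

```python
import sqlite3

# sqlite3 stands in for a major relational database here;
# "orders" and its columns are hypothetical example names.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)
rows = [(1, "east", 120.0), (2, "west", 80.5), (3, "east", 42.25)]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# Indexing the filter column is a basic relational performance-tuning step.
conn.execute("CREATE INDEX idx_orders_region ON orders (region)")

# Parameterized aggregate query: total order amount for one region.
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = ?", ("east",)
).fetchone()[0]
print(total)  # 162.25
```

Parameterized queries (the `?` placeholders) and selective indexing carry over directly to larger engines, even though the connection API differs.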

Requirements

  • Experience in data integration activities, including architecting, designing, coding, and testing phases.
  • Proficient in scripting techniques using Python.
  • Experience in performance tuning of Snowflake pipelines and troubleshooting issues.
  • Strong presentation and communication skills, both written and verbal.
  • Ability to problem-solve and convert requirements into design.
  • Experience designing and implementing fully operational solutions on Snowflake Data Warehouse or Hadoop.
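The Snowflake performance-tuning requirement above typically involves statements like reclustering a table or resizing a virtual warehouse. A small illustrative sketch is below: it only builds the SQL text, and the table, column, and warehouse names are hypothetical. In a real pipeline these strings would be executed through a Snowflake session rather than printed.

```python
# Illustrative only: compose common Snowflake tuning statements as SQL text.
# Object names below are hypothetical examples, not from the posting.

def cluster_by_statement(table: str, columns: list[str]) -> str:
    """ALTER TABLE ... CLUSTER BY: improves partition pruning on large tables."""
    cols = ", ".join(columns)
    return f"ALTER TABLE {table} CLUSTER BY ({cols})"

def resize_warehouse_statement(warehouse: str, size: str) -> str:
    """Resizing the virtual warehouse is another common tuning lever."""
    return f"ALTER WAREHOUSE {warehouse} SET WAREHOUSE_SIZE = '{size}'"

ddl = cluster_by_statement("sales.orders", ["order_date", "region"])
print(ddl)  # ALTER TABLE sales.orders CLUSTER BY (order_date, region)
```

Choosing clustering keys that match common filter predicates, and sizing warehouses to the workload, are the usual first steps when troubleshooting slow Snowflake pipelines.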

Nice-to-haves

  • Experience with Hadoop and other big data technologies.
  • Experience optimizing the performance of Spark jobs.

Benefits

  • Flexible work
  • Healthcare including dental, vision, mental health, and well-being programs
  • Financial well-being programs such as 401(k) and Employee Share Ownership Plan
  • Paid time off and paid holidays
  • Paid parental leave
  • Family building benefits like adoption assistance, surrogacy, and cryopreservation
  • Social well-being benefits like subsidized back-up child/elder care and tutoring
  • Mentoring, coaching and learning programs
  • Employee Resource Groups