Cognizant Technology Solutions - Philadelphia, PA

posted 2 months ago

Full-time - Mid Level
Philadelphia, PA
10,001+ employees
Professional, Scientific, and Technical Services

About the position

Cognizant is seeking a skilled Data Engineer with expertise in PySpark and Snowflake to join our team in Philadelphia, PA. The role centers on designing, developing, and maintaining scalable data pipelines and frameworks across cloud platforms and on-premises environments. The ideal candidate has a strong background in building generic ELT frameworks, performing proofs of concept (POCs) and assessments of new tools and technologies, and collaborating closely with stakeholders to understand and achieve business goals.

You will work within the AI & Analytics practice, which applies artificial intelligence and data analytics to transform business operations. In this role you will build robust data solutions that help clients make informed, data-driven decisions and better understand customer needs and preferences, enabling them to refine their services and products.

The position requires flexibility with varied data structures and a deep understanding of data architecture, so that you can quickly turn raw data into meaningful intelligence. You will also be expected to understand the Comcast ecosystem and recommend design ideas that align with business objectives, ensuring that clients' data strategies are both effective and efficient.
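
To give a concrete sense of the day-to-day work, the sketch below shows a minimal PySpark job that reads raw data, applies a simple transformation, and loads the result into Snowflake via the Spark-Snowflake connector. The S3 path, table name, and connection settings are illustrative placeholders, not details from this posting; a production pipeline would pull them from configuration or a secrets manager.

# Minimal sketch of a PySpark-to-Snowflake load of the kind this role involves.
# Paths, table names, and connection values below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("example-elt-load")  # hypothetical job name
    .getOrCreate()
)

# Extract: read raw events from a landing zone (path is illustrative).
raw = spark.read.parquet("s3://example-landing-zone/events/")

# Transform: light cleanup and an aggregate, standing in for real business logic.
daily_counts = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .count()
)

# Load: write to Snowflake with the Spark-Snowflake connector (assumes the
# connector JARs are available; option names are the connector's standard
# sfURL/sfUser/... settings, with placeholder values).
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "EXAMPLE_USER",
    "sfPassword": "********",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
}

(
    daily_counts.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "DAILY_EVENT_COUNTS")
    .mode("overwrite")
    .save()
)

In practice, a generic ELT framework wraps logic like this in reusable, configuration-driven components rather than hard-coded jobs.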

Responsibilities

  • Build generic ELT frameworks in cloud platforms and on-premises environments.
  • Design new frameworks and platforms, and improve existing designs.
  • Perform POCs and assessments on new tools and technologies.
  • Design and develop data pipelines.
  • Work with stakeholders to understand business goals and help achieve them.
  • Understand the Comcast ecosystem to recommend design ideas.

Requirements

  • Experience with PySpark and Snowflake.
  • Proven ability to design and develop scalable data pipelines.
  • Strong understanding of data warehousing concepts and practices.
  • Experience in building generic ELT frameworks.
  • Ability to perform POCs and assessments on new tools and technologies.
  • Excellent communication skills to work with stakeholders.

Nice-to-haves

  • Familiarity with cloud platforms such as AWS or Azure.
  • Experience in data modeling and data architecture.
  • Knowledge of machine learning concepts and applications.

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan