Cognizant Technology Solutions - Charlotte, NC

posted 5 days ago

Full-time - Mid Level
Charlotte, NC
10,001+ employees
Professional, Scientific, and Technical Services

About the position

The Sr. Cloud Data Engineer position at Cognizant involves working with advanced data engineering techniques and tools to support the company's AI and Analytics practice. The role focuses on data cleansing, standardization, and integration using various technologies, primarily in an AWS environment. The ideal candidate will leverage their extensive experience in data engineering to optimize data processes and contribute to the development of data-driven strategies for clients.

Responsibilities

  • Utilize Trillium for data cleansing, standardization, and matching processes with a focus on US Census data file matching.
  • Manage and optimize AWS services including S3, EFS, EBS, Lambda, and IAM roles.
  • Perform data engineering tasks using Databricks, including integrating JSON files from S3 into the raw layer and applying best practices.
  • Develop and maintain Python scripts for data processing and automation.
  • Extract data from various data stores including relational databases and file structures such as CSV, XML, and JSON.
  • Use Teradata utilities (BTEQ, FastLoad, MultiLoad) for data extraction and manipulation.
  • Write and maintain Unix shell scripts, including wrapper scripts, and monitor Unix logs for errors.
  • Create and troubleshoot complex SQL queries for backend testing and production issue resolution.
  • Utilize Informatica PowerCenter Client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) for ETL processes.
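The extraction bullet above names several file structures (CSV, XML, JSON). A minimal Python sketch of what pulling records out of each looks like; the sample data and helper names are illustrative, not from the posting:

```python
# Illustrative sketch: extract rows from CSV, JSON, and XML text.
# All file contents and function names here are hypothetical examples.
import csv
import io
import json
import xml.etree.ElementTree as ET

def read_csv(text):
    # DictReader maps each data row to a dict keyed by the header row.
    return list(csv.DictReader(io.StringIO(text)))

def read_json(text):
    # JSON arrays of objects parse directly into lists of dicts.
    return json.loads(text)

def read_xml(text):
    # Flatten each <row> element's children into a tag -> text dict.
    root = ET.fromstring(text)
    return [{child.tag: child.text for child in rec} for rec in root]

csv_rows = read_csv("id,name\n1,Ada\n2,Grace\n")
json_rows = read_json('[{"id": 1, "name": "Ada"}]')
xml_rows = read_xml("<rows><row><id>1</id><name>Ada</name></row></rows>")
```

In practice the same normalization step (everything lands as a list of dict-like records) is what lets downstream cleansing and matching tools treat heterogeneous sources uniformly.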

Requirements

  • 8 to 10 years of experience in data engineering roles.
  • Expertise in Trillium Control Centre for cleansing, standardization, and matching processes.
  • Strong knowledge of AWS services (S3, EFS, EBS, Lambda) and IAM roles.
  • Proficient in Databricks data engineering and best practices.
  • Advanced skills in Python programming.
  • Experience with Teradata and its utilities (BTEQ, FastLoad, MultiLoad).
  • Proficient in SQL and Unix scripting.
  • Experience with data extraction from various data stores and file structures.
  • Strong proficiency in Informatica PowerCenter Client tools.
  • Excellent problem-solving skills and attention to detail.

Nice-to-haves

  • Strong experience with Trillium-based cleansing, standardization, and US Census data file matching.
  • Experience in interacting with end-users, understanding their requirements, and troubleshooting production issues.
  • Ability to work independently and as part of a team.

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan with company contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan