Cognizant Technology Solutions - Grove City, OH

posted 2 months ago

Full-time - Mid Level
Remote - Grove City, OH
Professional, Scientific, and Technical Services

About the position

The AWS Databricks Engineer position at Cognizant is a remote data engineering role within the AI & Analytics practice. It calls for a highly skilled individual with extensive experience in AWS, Databricks, Python, and a range of data management tools. The engineer will be responsible for data cleansing, standardization, and integration, using AWS services to keep data processing and automation efficient. The position is central to transforming data into meaningful intelligence for clients.

Responsibilities

  • Utilize Trillium for data cleansing, standardization, and matching processes, with a focus on US Census data file matching.
  • Manage and optimize AWS services including S3, EFS, EBS, Lambda, and IAM roles.
  • Perform data engineering tasks in Databricks, including integrating JSON files from S3 into the raw layer and applying best practices (a sketch follows this list).
  • Develop and maintain Python scripts for data processing and automation.
  • Extract data from various data stores including relational databases and file structures such as CSV, XML, and JSON.
  • Use Teradata utilities (BTEQ, FastLoad, MultiLoad) for data extraction and manipulation.
  • Write and maintain Unix shell scripts, including wrapper scripts, and monitor Unix logs for errors (a second sketch follows this list).
  • Create and troubleshoot complex SQL queries for backend testing and production issue resolution.
  • Utilize Informatica PowerCenter Client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) for ETL processes.
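
As a rough illustration of the S3-to-raw-layer task above, here is a minimal PySpark sketch. The bucket, prefix, and table names are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark sketch: land JSON files from S3 into a raw Delta table.
# All names below (bucket, prefix, table) are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-json-ingest").getOrCreate()

source_path = "s3://example-bucket/incoming/events/"  # hypothetical S3 location
raw_table = "raw.events"                              # hypothetical raw-layer table

df = (
    spark.read
         .option("multiLine", "true")  # tolerate pretty-printed (multi-line) JSON
         .json(source_path)
)

# A common raw-layer practice: keep the payload untouched and add lineage columns.
df = (
    df.withColumn("_ingested_at", F.current_timestamp())
      .withColumn("_source_file", F.input_file_name())
)

df.write.format("delta").mode("append").saveAsTable(raw_table)
```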
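
And for the log-monitoring task, a minimal sketch of the error check a wrapper script might run after a job. It is written in Python rather than shell only to keep the examples in one language; the log path and error patterns are assumptions.

```python
# Minimal sketch of a post-job log check a wrapper script could invoke.
# The log path and error patterns are assumptions, not details from the posting.
import re
import sys
from pathlib import Path

LOG_PATH = Path("/var/log/etl/nightly_wrapper.log")  # hypothetical log file
ERROR_RE = re.compile(r"\b(ERROR|FATAL|Traceback)\b")

def error_lines(path: Path) -> list[str]:
    """Return the log lines that match the error pattern."""
    if not path.exists():
        sys.exit(f"log file not found: {path}")
    return [ln.rstrip() for ln in path.read_text().splitlines() if ERROR_RE.search(ln)]

if __name__ == "__main__":
    hits = error_lines(LOG_PATH)
    for ln in hits:
        print(ln)
    # A non-zero exit tells the calling wrapper or scheduler that errors were found.
    sys.exit(1 if hits else 0)
```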

Requirements

  • 8 to 10 years of experience in data engineering roles.
  • Expertise in cleansing, standardization, and matching processes with Trillium Control Center.
  • Strong knowledge of AWS services (S3, EFS, EBS, Lambda) and IAM roles.
  • Proficient in Databricks data engineering and best practices.
  • Advanced skills in Python programming.
  • Experience with Teradata and its utilities (BTEQ, FastLoad, MultiLoad).
  • Proficient in SQL and Unix scripting.
  • Experience with data extraction from various data stores and file structures.
  • Strong proficiency in Informatica PowerCenter Client tools.
  • Excellent problem-solving skills and attention to detail.

Nice-to-haves

  • Well organized, self-motivated, and a quick learner.
  • Effective verbal and written communication skills.