AWS Data Engineer

$77,200 - $168,900/Yr

CGI Technologies and Solutions

posted 2 months ago

Full-time - Mid Level
Hybrid
10,001+ employees
Professional, Scientific, and Technical Services

About the position

The AWS Data Engineer at CGI will play a crucial role in a high-volume data ingestion and search modernization project within the Judicial Sector. This position involves creating and managing data ingestion pipelines, automating data analysis, and developing knowledge bases using advanced AI models. The engineer will also focus on data architecture, pipeline version control, and data quality management, applying software engineering best practices and leveraging cloud technologies to deliver value to clients.

Responsibilities

  • Work with the technical development team and team leader to understand desired application capabilities.
  • Set up ingestion pipelines that pull new and updated artifacts into knowledge bases (see the illustrative sketch after this list).
  • Continuously improve machine learning models.
  • Design and apply data architectures for field-based and semantic search.
  • Prioritize the most accurate and most recent artifacts within knowledge bases.
  • Collaborate with Agile teams to test and support technical solutions across development tools and technologies.
  • Develop applications in AWS data and analytics technologies including OpenSearch, RDS, S3, Athena, Lambda, Step Functions, Glue, SageMaker, and others.
  • Integrate open-source components into data-analytic solutions.
  • Work with vendors to enhance tool capabilities.
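
For context on the ingestion responsibility above, here is a minimal, illustrative sketch of one such pipeline step: pulling new or updated artifacts from S3 and indexing them into an OpenSearch knowledge base. The bucket, domain, and index names are hypothetical, and authentication for the OpenSearch domain is omitted; this is not CGI's implementation.

```python
# Illustrative sketch only (not CGI's implementation): pull new or updated
# artifacts from an S3 bucket and index them into an OpenSearch knowledge base.
# Bucket, domain, and index names are hypothetical; request signing for the
# OpenSearch domain is omitted for brevity.
from datetime import datetime

import boto3
from opensearchpy import OpenSearch

s3 = boto3.client("s3")
search = OpenSearch(
    hosts=[{"host": "example-domain.us-east-1.es.amazonaws.com", "port": 443}],
    use_ssl=True,
)

def ingest_new_artifacts(bucket: str, index: str, since: datetime | None = None) -> None:
    """Index every artifact modified after `since` into the search index."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            if since and obj["LastModified"] <= since:
                continue  # already ingested on a previous run
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            search.index(
                index=index,
                id=obj["Key"],  # stable id, so re-ingestion updates instead of duplicating
                body={
                    "key": obj["Key"],
                    "last_modified": obj["LastModified"].isoformat(),
                    "content": body.decode("utf-8", errors="replace"),
                },
            )
```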

Requirements

  • Proficient in navigating the AWS console and interacting with AWS programmatically through the AWS Python SDK (Boto3) and the AWS CLI (a brief sketch follows this list).
  • Hands-on experience with AWS services such as RDS, S3, Lambda, Step Functions, Glue, SQS, SNS, CloudTrail, CloudWatch, VPC, EC2, and IAM.
  • Proficiency in Python, including data structures, custom classes/modules, object-oriented code organization, and database/API interaction.
  • Deep experience troubleshooting complex end-to-end data processing issues.
  • Hands-on experience with high-volume data application development and version control systems like Git.
  • Experience implementing data ingestion pipelines that incorporate ETL processes.
  • Experience in data modeling and relational database design of large datasets.
  • Knowledge of application development lifecycles and continuous integration/deployment practices.
  • 7-10 years of experience delivering and operating large-scale, highly visible distributed systems.
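
As an illustration of the programmatic AWS interaction called for above, the following hedged sketch starts a Step Functions execution through Boto3 and checks its status. The state machine ARN, account ID, and input payload are hypothetical examples, not details from the posting.

```python
# Hedged example of programmatic AWS interaction through the Python SDK (Boto3):
# start one Step Functions execution for an ingestion pipeline and check its
# status. The state machine ARN, account id, and input payload are hypothetical.
import json

import boto3

sfn = boto3.client("stepfunctions")

def start_ingestion_run(source_prefix: str) -> str:
    """Start a pipeline execution and return its execution ARN."""
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:ingest-artifacts",
        input=json.dumps({"source_prefix": source_prefix}),
    )
    return response["executionArn"]

# Example usage:
# arn = start_ingestion_run("court-filings/2024/")
# print(sfn.describe_execution(executionArn=arn)["status"])
```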

Nice-to-haves

  • DevOps practices: infrastructure as code (IaC) for pipelines, pipeline monitoring and logging, code versioning, data versioning, and containerization (a brief IaC sketch follows this list).
  • Experience with the Atlassian toolset (Jira, Confluence).
  • Experience with DynamoDB or other NoSQL databases, and with Redshift.
  • Experience with API design and Amazon API Gateway.
  • Elasticsearch/OpenSearch experience.
  • AWS Certifications.
  • Agile or SAFe Certification.
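
As a rough illustration of "infrastructure as code for pipelines" mentioned above, here is a hedged sketch using the AWS CDK for Python to declare two pipeline resources. The stack, bucket, and function names are hypothetical examples, not resources described in the posting.

```python
# Rough sketch of "infrastructure as code for pipelines" using the AWS CDK for
# Python. The stack, bucket, and function names are hypothetical examples.
from aws_cdk import Stack, aws_lambda as lambda_, aws_s3 as s3
from constructs import Construct

class IngestionPipelineStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Landing bucket for raw artifacts
        artifacts = s3.Bucket(self, "ArtifactBucket")
        # Lambda function that normalizes artifacts before they are indexed
        lambda_.Function(
            self,
            "NormalizeFn",
            runtime=lambda_.Runtime.PYTHON_3_12,
            handler="normalize.handler",
            code=lambda_.Code.from_asset("lambda/"),
            environment={"ARTIFACT_BUCKET": artifacts.bucket_name},
        )
```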

Benefits

  • Competitive compensation
  • Comprehensive insurance options
  • Matching contributions through the 401(k) plan and the share purchase plan
  • Paid time off for vacation, holidays, and sick time
  • Paid maternity and parental leave
  • Learning opportunities and tuition assistance
  • Member assistance and wellness programs