AWS Data Engineer

$77,200 - $168,900/Yr

CGI - Lafayette, LA

posted 2 months ago

Full-time - Mid Level
Hybrid - Lafayette, LA
10,001+ employees
Professional, Scientific, and Technical Services

About the position

The AWS Data Engineer at CGI will play a crucial role in a high-volume data ingestion and search modernization project within the Judicial Sector. This position involves creating and managing data ingestion pipelines, automating data analysis, and developing knowledge bases using advanced AI models. The engineer will also focus on data architecture, version control, data lineage tracking, and ensuring data quality throughout the automated processes. The role emphasizes the application of software engineering best practices and the use of cutting-edge cloud technologies to deliver value to clients.

Responsibilities

  • Work with the technical development team and team leader to understand desired application capabilities.
  • Set up ingestion pipelines that pull new and updated artifacts into knowledge bases (a sketch of this pattern follows this list).
  • Continuously improve machine learning models.
  • Design and apply data architectures for field-based and semantic search.
  • Prioritize more accurate or more recent artifacts within knowledge bases.
  • Collaborate with Agile teams to test and support technical solutions across development tools and technologies.
  • Develop applications using AWS data and analytics technologies such as OpenSearch, RDS, S3, Athena, Lambda, and others.
  • Integrate open-source components into data-analytic solutions.
  • Work with vendors to enhance tool capabilities.
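
A minimal sketch, assuming boto3 and the opensearch-py client, of the ingestion pattern called out in the pipeline responsibility above: list artifacts in an S3 bucket, skip anything that has not changed, and index the rest into an OpenSearch knowledge base. The bucket name, endpoint, and index name are hypothetical placeholders, not details from this posting.

```python
import boto3
from opensearchpy import OpenSearch

# Hypothetical names for illustration only; real buckets, endpoints, and
# index mappings would come from the project's configuration.
BUCKET = "example-artifact-bucket"
INDEX = "knowledge-base-artifacts"

s3 = boto3.client("s3")
search = OpenSearch(
    hosts=[{"host": "example-opensearch-endpoint.example.com", "port": 443}],
    use_ssl=True,
)

def ingest_new_artifacts(prefix: str, since) -> int:
    """Pull artifacts added or updated after `since` and index them."""
    indexed = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj["LastModified"] <= since:
                continue  # unchanged artifact: nothing to re-index
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
            # Key the document by object key so a re-ingested artifact
            # overwrites its previous version in the index.
            search.index(
                index=INDEX,
                id=obj["Key"],
                body={
                    "content": body.decode("utf-8"),
                    "last_modified": obj["LastModified"].isoformat(),
                },
            )
            indexed += 1
    return indexed
```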

Requirements

  • Proficient in navigating the AWS console and in interacting with AWS programmatically through Python SDKs and the AWS CLI (see the short illustration after this list).
  • Hands-on experience with AWS services: RDS, S3, Lambda, Step Functions, Glue, SQS, SNS, CloudTrail, CloudWatch, VPC, EC2, and IAM.
  • Proficiency in Python, including data structures, custom classes/modules, and database/API interaction.
  • Deep experience troubleshooting complex end-to-end data processing issues.
  • Hands-on experience with high-volume data application development and version control systems like Git.
  • Experience implementing data ingestion processes incorporating ETL.
  • Experience in data modeling and relational database design of large datasets.
  • Knowledge of application development lifecycles and continuous integration/deployment practices.
  • 7-10 years of experience delivering and operating large-scale distributed systems.
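
As a rough illustration of the programmatic AWS interaction listed above, the sketch below uses boto3 to start a Step Functions execution that might drive an ETL run and to check its status afterward. The state machine ARN and input payload are hypothetical, not taken from this posting.

```python
import json
import boto3

# Hypothetical ARN for illustration; an actual state machine would be
# provisioned separately (e.g., through the console or infrastructure code).
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:example-etl"

sfn = boto3.client("stepfunctions")

def run_etl(batch_id: str) -> str:
    """Start one ETL execution for a batch and return its execution ARN."""
    response = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        name=f"etl-{batch_id}",
        input=json.dumps({"batch_id": batch_id}),
    )
    return response["executionArn"]

def etl_status(execution_arn: str) -> str:
    """Return the execution status, e.g. RUNNING, SUCCEEDED, or FAILED."""
    return sfn.describe_execution(executionArn=execution_arn)["status"]
```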

Nice-to-haves

  • Knowledge of IaC using Terraform.
  • Agile development experience.
  • DevOps practices for pipelines, monitoring, and logging.
  • Experience with the Atlassian toolset (Jira, Confluence).
  • Experience with DynamoDB or other NoSQL databases, and with Redshift.
  • API design and API Gateway experience.
  • Elasticsearch/OpenSearch experience.
  • AWS Certifications.
  • Agile or SAFe Certification.

Benefits

  • Competitive compensation
  • Comprehensive insurance options
  • Matching contributions through the 401(k) plan and share purchase plan
  • Paid time off for vacation, holidays, and sick time
  • Paid maternity and parental leave
  • Learning opportunities and tuition assistance
  • Member assistance and wellness programs