Aptino - Philadelphia, PA

posted 4 months ago

Full-time
Philadelphia, PA
Professional, Scientific, and Technical Services

About the position

We are seeking a talented and motivated Machine Learning Engineer to join our team in Philadelphia, PA. This is an exciting opportunity for anyone passionate about applying machine learning and artificial intelligence to complex problems. The ideal candidate will have a strong background in Python and SQL, which are essential for day-to-day data work. You will be responsible for designing, building, and maintaining the machine learning models and data pipelines that drive our AI initiatives.

In this role, you will work extensively with AWS services, including SageMaker, Lambda, Glue, S3, IAM, CodeCommit, CodePipeline, and Bedrock. Your expertise with these tools will be crucial for deploying and managing machine learning models in the cloud. You will also use data pipeline tools such as Apache Airflow or AWS Step Functions to orchestrate ETL processes and ensure efficient data flow.

As a Machine Learning Engineer, you will implement MLOps practices, including model monitoring, data drift detection, and automation, to improve the performance and reliability of our machine learning solutions. Familiarity with containerization technologies such as Docker and AWS ECR will also help you create scalable, reproducible environments for your models. This position is contractual and requires an onsite presence in Philadelphia, PA.

Responsibilities

  • Design, build, and maintain machine learning models and data pipelines.
  • Utilize Python and SQL for data tasks and analysis.
  • Work with AWS services such as SageMaker, Lambda, Glue, S3, IAM, CodeCommit, CodePipeline, and Bedrock.
  • Implement ETL processes and manage data warehousing solutions.
  • Utilize data pipeline tools like Apache Airflow or AWS Step Functions.
  • Apply MLOps practices including model monitoring and data drift detection.
  • Automate machine learning workflows and processes.
  • Leverage AI/ML tools such as TensorFlow, PyTorch, and MLflow for model development.
  • Containerize applications using Docker and manage them with AWS ECR.
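To give candidates a flavor of the data drift detection mentioned above: one simple approach is flagging drift when a feature's mean in production shifts well outside its training-time distribution. This sketch uses only the Python standard library; the function name and threshold are illustrative, not part of this role's actual stack.

```python
import statistics


def detect_drift(baseline, current, threshold=2.0):
    """Flag drift when the mean of `current` deviates from the mean of
    `baseline` by more than `threshold` baseline standard deviations."""
    mean_b = statistics.mean(baseline)
    std_b = statistics.stdev(baseline)
    if std_b == 0:
        # Constant baseline: any change in mean counts as drift.
        return statistics.mean(current) != mean_b
    z = abs(statistics.mean(current) - mean_b) / std_b
    return z > threshold


# Illustrative usage: a feature whose production values have shifted.
baseline = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05]
stable = [1.0, 1.02, 0.98]
drifted = [2.0, 2.1, 1.9]
```

In practice this kind of check would run on a schedule (e.g. via Airflow or Step Functions) against recent inference data, with alerts wired into the model monitoring setup.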

Requirements

  • Proficiency in Python and SQL for data manipulation and analysis.
  • Experience with AWS services relevant to machine learning and data processing.
  • Familiarity with data pipeline tools such as Apache Airflow or AWS Step Functions.
  • Knowledge of ETL processes and data warehousing concepts.
  • Experience with AI/ML tools like TensorFlow, PyTorch, and MLflow.
  • Understanding of MLOps practices including model monitoring and automation.
  • Experience with containerization technologies like Docker and AWS ECR.