Machine Learning Engineer

$150,000 - $175,000/Yr

Allio Capital - Seattle, WA

posted about 1 month ago

Full-time - Mid Level

About the position

Allio Capital is seeking a skilled Machine Learning Engineer with a strong background in data engineering to develop and implement advanced machine learning models that enhance the company's products and services. The role involves building data pipelines, handling large datasets, and creating scalable machine learning solutions, particularly in the context of financial and economic time-series data.

Responsibilities

  • Design and develop robust data pipelines and ETL processes to support machine learning models.
  • Ensure data quality and integrity across various data sources.
  • Work with data warehousing solutions, particularly Snowflake, to manage and optimize data storage.
  • Collect, preprocess, and analyze financial and economic time-series data.
  • Apply time-series modeling techniques to extract insights and forecast trends.
  • Build and deploy machine learning models and algorithms tailored to business needs.
  • Conduct experiments to test hypotheses and validate models.
  • Optimize machine learning algorithms for performance and scalability.
  • Implement techniques to improve model accuracy and reduce computational costs.
  • Collaborate with software engineers to integrate machine learning models into production systems.
  • Ensure models are robust, scalable, and maintainable in real-world applications.
  • Stay current with the latest developments in machine learning and data engineering.
  • Explore new technologies and methodologies to enhance existing solutions.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.
  • Proven experience as a Machine Learning Engineer or in a similar role.
  • Strong programming skills in Python.
  • Experience with machine learning frameworks such as TensorFlow, PyTorch, XGBoost, LightGBM, or scikit-learn.
  • Proficiency in data engineering tools and techniques, including SQL, NoSQL databases, and ETL processes.
  • Experience with data warehousing platforms, particularly Snowflake.
  • Familiarity with big data technologies like Apache Spark or Hadoop.
  • Strong understanding of statistical modeling, data mining, and data visualization techniques.
  • Experience working with financial and/or economic time-series data.
  • Experience with software development tools and version control systems (e.g., Git).
  • Excellent problem-solving skills and attention to detail.

Nice-to-haves

  • Experience with cloud platforms like AWS, Google Cloud, or Azure.
  • Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).
  • Familiarity with data pipeline tools such as Apache Airflow or Luigi.
  • Strong communication skills and the ability to work collaboratively in a team environment.