An inference server for your machine learning models, including support for multiple frameworks, multi-model serving and more
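For context, MLServer serves models through custom runtime classes. The sketch below is a minimal, illustrative runtime assuming MLServer's `MLModel` base class and `decode_args` codec helper; the class name and the toy "doubling" model are placeholders, not taken from any repository listed here.

```python
import numpy as np

from mlserver import MLModel
from mlserver.codecs import decode_args


class MyRuntime(MLModel):
    """Minimal custom MLServer runtime (illustrative only)."""

    async def load(self) -> bool:
        # Load model artifacts here (e.g. from the model settings URI).
        self.ready = True
        return self.ready

    @decode_args
    async def predict(self, payload: np.ndarray) -> np.ndarray:
        # Toy "model": return the input doubled.
        return payload * 2
```

Such a runtime is typically started with `mlserver start`, pointing at a `model-settings.json` that names the implementation class.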
MLOps tutorial using Python, Docker and Kubernetes.
Deploy A/B testing infrastructure in a containerized microservice architecture for Machine Learning applications.
The CartPole game solved with Reinforcement Learning: a journey from training to inference
Tool to take your ML model from local to production with one line of code.
This repo shows how to build and deploy a simple pipeline using Kubernetes, Kubeflow Pipelines and seldon-core.
Serve containerized machine learning models in a microservice architecture with seldon-core or TensorFlow Serving
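As background, Seldon Core's Python wrapper expects a plain model class exposing a `predict` method. The sketch below is a minimal example of that convention; the class name and the constant prediction are placeholders, not from any repository listed here.

```python
import numpy as np


class IrisClassifier:
    """Model class following Seldon Core's Python wrapper convention (illustrative only)."""

    def __init__(self):
        # Load model artifacts here (e.g. a pickled scikit-learn model).
        self.loaded = True

    def predict(self, X: np.ndarray, features_names=None):
        # Return one prediction row per input row; a real model would call
        # something like self.model.predict_proba(X) here.
        return np.tile([0.7, 0.2, 0.1], (X.shape[0], 1))
```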
A deployment using Seldon's open-source MLServer