Automatic ML Model Containerization
Containerizing machine learning models can be a pain. This talk covers a new open-source approach to packaging machine learning (ML) models into container images for production inference. Chassis.ml and the Open Model Interface are changing the game with a...
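The general pattern that tools like Chassis.ml automate can be sketched in plain Python. This is an illustrative sketch, not Chassis.ml's actual API: a trained model is serialized at build time, and the container exposes a single predict entry point that deserializes it once and serves requests. All names below are hypothetical.

```python
import pickle

def train_model():
    # A trivial stand-in "model" that scales its input; real tools wrap
    # arbitrary trained models (scikit-learn, PyTorch, etc.).
    return {"scale": 2.0}

def predict(model, x):
    # The single inference entry point a containerized image would expose.
    return model["scale"] * x

# Build step: serialize the trained model into the image.
blob = pickle.dumps(train_model())

# Inside the running container: deserialize once, then serve predictions.
loaded = pickle.loads(blob)
print(predict(loaded, 3.0))  # → 6.0
```

In practice, the containerization tooling generates this serving layer for you and pairs it with a standard HTTP or gRPC interface, which is what makes the resulting images portable across runtimes.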
Hardware Accelerators for ML Inference
Many different types of hardware can accelerate ML computations – CPUs, GPUs, TPUs, FPGAs, ASICs, and more. Learn about each type, when to use it, and how it can speed up ML inference and the performance of ML...