From AutoML-powered development to cloud-native deployment, MONAI marches forward with four new releases.

MONAI Medical Open Network for AI
Nov 29, 2021


MONAI has released new versions of three of its existing tools and is introducing a new addition to its deployment offering: the MONAI Inference Service (MIS). With MIS, researchers can quickly deploy MONAI Application Packages (MAPs) as cloud-native microservices and run scalable inference on their data using an existing Kubernetes cluster.

MONAI Core v0.8

The first new release is MONAI Core v0.8, which expands the available learning methods, including support for self-supervised and multi-instance learning. It also includes a new AutoML technique called Differentiable Neural Network Topology Search (DiNTS) and new visualization techniques for the various transforms already available in MONAI.

Self-supervised learning allows us to utilize unlabeled data by generating pre-trained weights from self-supervised tasks built on different augmentation types. MONAI Core now provides an example tutorial that uses the TCIA-Covid19 dataset to generate the pre-trained weights, which are then fine-tuned on the Beyond the Cranial Vault (BTCV) dataset.
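
As a rough illustration of this workflow (not the tutorial's exact recipe), the sketch below corrupts unlabeled volumes with MONAI transforms and trains a network to reconstruct the clean copies; the saved weights could then initialize a model before fine-tuning on labeled data. The file paths, network choice, and hyperparameters are placeholders.

```python
# Simplified self-supervised pretraining sketch; file paths, the network choice,
# and all hyperparameters are placeholders, not the tutorial's settings.
import torch
from monai.data import DataLoader, Dataset
from monai.networks.nets import UNet
from monai.transforms import (
    Compose, CopyItemsd, EnsureChannelFirstd, EnsureTyped,
    LoadImaged, RandCoarseDropoutd, RandSpatialCropd, ScaleIntensityd,
)

unlabeled_files = [{"image": p} for p in ["ct_001.nii.gz", "ct_002.nii.gz"]]  # placeholders

# Keep a clean copy of each crop under "target", then corrupt "image" with coarse dropout.
pretrain_transforms = Compose([
    LoadImaged(keys="image"),
    EnsureChannelFirstd(keys="image"),
    ScaleIntensityd(keys="image"),
    RandSpatialCropd(keys="image", roi_size=(96, 96, 96), random_size=False),
    CopyItemsd(keys="image", times=1, names=["target"]),
    RandCoarseDropoutd(keys="image", holes=6, spatial_size=16, prob=1.0),
    EnsureTyped(keys=["image", "target"]),
])

loader = DataLoader(Dataset(unlabeled_files, pretrain_transforms), batch_size=2, shuffle=True)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = UNet(spatial_dims=3, in_channels=1, out_channels=1,
             channels=(16, 32, 64, 128), strides=(2, 2, 2)).to(device)
loss_fn = torch.nn.L1Loss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(10):  # placeholder epoch count
    for batch in loader:
        corrupted, target = batch["image"].to(device), batch["target"].to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(corrupted), target)  # reconstruction pretext task
        loss.backward()
        optimizer.step()

# Reuse these weights to initialize the fine-tuning model (e.g. on BTCV).
torch.save(model.state_dict(), "pretrained_weights.pt")
```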

Self-Supervised Learning

Multi-Instance Learning (MIL) is a supervised learning technique that uses labeled bags of instances rather than individually labeled samples. MIL is a crucial approach for classifying whole-slide images (WSI), which can contain billions of pixels and would otherwise require extraordinary computational and annotation resources. MONAI now includes a new network architecture called MILModel, which provides three Multi-Instance Learning modes: mean, max, and attention-based methods. The attention-based methods build on state-of-the-art research that accounts for dependencies in deep-learning-based Multiple Instance Learning (https://arxiv.org/abs/2111.01556).
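
As a minimal sketch (the constructor arguments and the bag tensor layout below are assumptions to verify against the v0.8 API docs), a bag of patches from one slide can be classified in a few lines:

```python
# Hedged MILModel sketch: assumes the constructor takes num_classes/mil_mode and
# that bags are passed as a (batch, bag_size, channels, H, W) tensor.
import torch
from monai.networks.nets import MILModel

# Attention-based MIL head over a CNN backbone; other modes include "mean" and "max".
model = MILModel(num_classes=2, mil_mode="att")

# One bag of 32 RGB patches (224 x 224) extracted from a single whole-slide image.
bag = torch.rand(1, 32, 3, 224, 224)
logits = model(bag)   # one prediction per bag, not per patch
print(logits.shape)   # expected: torch.Size([1, 2])
```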

Multi-Instance Learning (MIL) for Whole-Slide Images (WSI)

DiNTS applies neural architecture search to find high-performance networks for medical image segmentation. The method addresses common challenges of large-scale 3D image datasets by supporting flexible multi-path network topologies, high search efficiency, and controlled GPU memory usage. You can find example notebooks that use DiNTS with the Medical Segmentation Decathlon (MSD) datasets to achieve state-of-the-art performance.
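
The sketch below mirrors the pattern used in those notebooks as best understood; the class names come from monai.networks.nets, but the specific constructor arguments here are assumptions, so check the DiNTS tutorial for the exact API.

```python
# Hedged DiNTS sketch: argument names/values below are assumptions based on the
# v0.8 tutorials; verify against the official DiNTS notebooks before use.
import torch
from monai.networks.nets import DiNTS, TopologySearch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Differentiable search space over multi-path 3D network topologies.
dints_space = TopologySearch(
    channel_mul=0.5,      # shrink channels to keep GPU memory in check during search
    num_blocks=9,
    num_depths=4,
    use_downsample=True,
    device=device,
)

# Segmentation network whose topology and cell operations are searched jointly.
model = DiNTS(
    dints_space=dints_space,
    in_channels=1,
    num_classes=3,
    use_downsample=True,
).to(device)

pred = model(torch.rand(1, 1, 96, 96, 96, device=device))
print(pred.shape)   # expected: (1, 3, 96, 96, 96)
```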

Differentiable Neural Network Topology Search (DiNTS)

The MONAI Core v0.8 release also includes a transform visualization notebook for the existing MONAI transforms: visualizing images with matplotlib via the MONAI matshow3d API, with TensorBoard via the MONAI plot_2d_or_3d_image API, and with ITKWidgets. The notebook also covers how to blend two images of the same shape.
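
For example (the file paths are placeholders, and the blend utility is assumed to be monai.visualize.blend_images), a volume and its segmentation can be previewed like this:

```python
# Visualization sketch: tile the slices of a 3D volume with matshow3d and build
# an image/label overlay. File paths are placeholders.
import matplotlib.pyplot as plt
import numpy as np
from monai.transforms import LoadImage
from monai.visualize import blend_images, matshow3d

image = LoadImage(image_only=True)("ct_001.nii.gz")       # placeholder path
label = LoadImage(image_only=True)("ct_001_seg.nii.gz")   # placeholder path

# Show every 5th slice of the volume in a single matplotlib figure.
fig = plt.figure()
matshow3d(volume=image, fig=fig, title="CT volume", every_n=5, frame_dim=-1, cmap="gray")

# Blend two channel-first arrays of the same shape into an RGB overlay.
overlay = blend_images(image=image[None], label=label[None], alpha=0.5)

# Display the middle slice of the overlay (move the RGB channel to the last axis).
mid = overlay.shape[-1] // 2
plt.figure()
plt.imshow(np.moveaxis(overlay[..., mid], 0, -1))
plt.show()
```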

Transform Visualization using the MONAI matshow3d API

MONAI Label v0.3

The second new release is MONAI Label v0.3, which includes multi-label segmentation support for existing applications, increased performance through multi-GPU training, and a better Active Learning user experience.

First, multi-label segmentation support updates the existing DeepEdit and DeepGrow networks provided by MONAI Label. It includes an upgraded UI to support multi-label tasks, training scripts modified to work with multi-label tasks, and a more robust naming and error system for associating label names with label numbers.

Next, to help increase the performance of the MONAI Label training loop, multi-GPU support has been added to the existing workflows, including updates to the data loader. You can now indicate how many GPUs you want to use during training.

Last, user experience is an integral part of the training process. To help enable a simpler and more intuitive Active Learning experience, we've added options that let you train specific models and let users skip images if they feel the current image selection isn't a good one.

MONAI Deploy

The third new release is MONAI Deploy App SDK v0.2, which includes two new base operators for DICOM interactions: one for DICOM series selection and another for exporting classification results as DICOM Structured Report (SR) SOP instances.
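
As a hedged sketch (the rules JSON and constructor arguments below are assumptions based on the App SDK examples; check the v0.2 API docs for exact signatures), the two operators can be set up like this:

```python
# Hedged App SDK v0.2 sketch: instantiate the two new DICOM operators.
# The rules JSON and constructor arguments are assumptions to verify against the docs.
from monai.deploy.operators import (
    DICOMSeriesSelectorOperator,
    DICOMTextSRWriterOperator,
)

# Select only CT series from a loaded DICOM study using a JSON rules string.
selection_rules = """
{
    "selections": [
        {"name": "CT Series", "conditions": {"Modality": "(?i)CT"}}
    ]
}
"""
series_selector = DICOMSeriesSelectorOperator(rules=selection_rules)

# Write a classification result back out as a DICOM Structured Report,
# copying patient/study tags from the selected source series.
sr_writer = DICOMTextSRWriterOperator(copy_tags=True)

# In a full Application.compose(), these would be chained with add_flow(), e.g.
# DICOM loader -> series_selector -> inference operator -> sr_writer.
```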

Expanding on the MONAI Deploy offerings, a new component called MONAI Inference Service (MIS) has been released. This tool allows researchers to build and deploy a scalable inference server for their data using an existing Kubernetes cluster.

MONAI Inference Service allows for deployment on Kubernetes using Helm

Highlights include:

  • Register a MAP in the Helm Charts of MIS.
  • Upload inputs via a REST API request and make them available to the MAP container (see the request sketch after this list).
  • Provision resources for the MAP container.
  • Provide outputs of the MAP container to the client who made the request.
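
As a rough illustration of that request flow (the service address, route, and form field below are placeholders, not the documented MIS API), a client-side upload might look like:

```python
# Hypothetical MIS client sketch: the URL, route, and form field names are
# placeholders/assumptions -- consult the MIS documentation for the real REST API.
import requests

MIS_URL = "http://127.0.0.1:8000/upload"       # placeholder: MIS service address and route

with open("input_ct.nii.gz", "rb") as f:       # placeholder input file
    response = requests.post(
        MIS_URL,
        files={"file": ("input_ct.nii.gz", f)},
        timeout=300,
    )

response.raise_for_status()
# The MAP's output (e.g. a results archive) is assumed to come back in the response body.
with open("map_output.zip", "wb") as out:
    out.write(response.content)
```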

We’ve also included new MONAI Deploy tutorials that walk you through creating a MAP, deploying MIS, and pushing your MAP to MIS to be run in a Kubernetes cluster.

MONAI Deploy Tutorials: Web-based or Jupyter Notebooks

MONAI continues to expand its core capabilities throughout the Medical AI workflow. Get started by checking out the notebooks mentioned in the sections above, or head over to our GitHub repos and start contributing today!
