AI-based image segmentation and labeling with free open source software, 3D Slicer
Demo selection form
Link to vote for the demos:
https://forms.gle/k551zxgQPN51yiTH7
Material link: https://drive.google.com/drive/folders/1_o4epAKasD37eCHx02D7Ro-1hsTuxEHc?usp=sharing
Participants need to download 3D Slicer and all materials before the tutorial, as the internet connection may be limited!
Description
This tutorial aims to help new, intermediate, and seasoned 3D Slicer users alike to harness the various image segmentation and annotation tools effectively, and in doing so open up the capabilities of AI-assisted medical image computing technologies to a wider audience. Importantly, as this is the first MICCAI conference on the African continent, our tutorial is intended to align with this year's wider objective of low-cost medical image computing and computer-assisted interventions.
Attendees will start by learning about the objectives and impact of 3D Slicer through a plenary talk delivered by an invited speaker. This will be followed by a brief introductory session to segmentation within 3D Slicer, during which attendees will follow along on their laptops and learn how to use built-in tools to segment sample data. Next, we will explore AI-based segmentation tools and teach attendees how to deploy segmentation models on their own computers. Finally, the tutorial will close with a hands-on demonstration session in which attendees will have the opportunity to see and use clinical research systems that integrate segmentation and AI. To maximize the utility and interest of the hands-on demonstration portion of the tutorial, attendees will be asked to vote on possible topics during workshop registration.
The core learning outcomes of the tutorial will include:
- How to load and visualize data using 3D Slicer
- How to use various built-in and AI-based segmentation tools to annotate data
- How to utilize annotated data and segmentation tools within clinical applications of interest
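For attendees who want a head start on the first two outcomes, the following is a minimal sketch using 3D Slicer's built-in Python console. It relies only on the bundled SampleData module; the segment name "bone" is purely illustrative.

```python
# Run in 3D Slicer's Python console (the `slicer` module is predefined there).
import SampleData

# Download and load one of Slicer's bundled sample datasets.
volumeNode = SampleData.downloadSample("MRHead")

# Create a segmentation node linked to the loaded volume.
segmentationNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSegmentationNode")
segmentationNode.CreateDefaultDisplayNodes()
segmentationNode.SetReferenceImageGeometryParameterFromVolumeNode(volumeNode)

# Add an empty segment, ready to be painted in the Segment Editor module.
segmentationNode.GetSegmentation().AddEmptySegment("bone")
```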
Keynote presentation
TBN
Tentative Schedule
- 45 min: Short keynote talk (plenary)
  TBD
- 45 min: 3D Slicer Introductory Session (plenary)
  This session will cover basic 3D Slicer usage, including loading and visualizing medical imaging data, as well as basic usage of the built-in segmentation module.
- 30 min: Coffee break
- 1 hour: AI-based segmentation and labeling (most likely plenary)
  AI-based segmentation tools, and training and deploying a deep neural network for segmentation in 3D Slicer (TotalSegmentator and/or other tools, depending on audience preference; see the sketch after this schedule).
- Time remaining: Hands-on demos and instruction based on attendee interest (in small break-out groups)
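As a concrete taste of this session, the sketch below shows how TotalSegmentator can be run outside of Slicer via its Python API (the Slicer extension wraps the same functionality); the file paths are placeholders.

```python
# Minimal sketch of whole-body CT segmentation with TotalSegmentator
# (pip install totalsegmentator). Input/output paths are placeholders.
from totalsegmentator.python_api import totalsegmentator

# Segments all supported anatomical structures in a CT volume and writes
# the results to the output directory.
# CLI equivalent: TotalSegmentator -i ct.nii.gz -o segmentations
totalsegmentator("ct.nii.gz", "segmentations")
```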
Hands-on Demos
AR Surgical Planning with HoloLens and 3D Slicer
The following demos describe a novel approach to connecting 3D Slicer and Microsoft HoloLens 2 using the OpenIGTLink communication protocol. This integration combines the strengths of both tools to improve visualization and interaction with medical images. Specifically, the AR glasses display medical images received from 3D Slicer in real time.
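The HoloLens client application itself is not reproduced here, but on the 3D Slicer side the link is set up through a connector node. The sketch below assumes the SlicerOpenIGTLink extension is installed, and uses a placeholder node name and the conventional default port.

```python
# Run in 3D Slicer's Python console with the SlicerOpenIGTLink extension.
# Start an OpenIGTLink server that the HoloLens client can connect to.
connectorNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connectorNode.SetTypeServer(18944)  # 18944 is the customary OpenIGTLink port
connectorNode.Start()

# Stream a scene node (e.g., a planning model or image) to connected clients.
modelNode = slicer.util.getNode("PlanningModel")  # placeholder node name
connectorNode.RegisterOutgoingMRMLNode(modelNode)
connectorNode.PushNode(modelNode)
```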
1. Augmented Reality and 3D Slicer for dental implant placement planning
Video: https://www.youtube.com/watch?v=1LDxDY3gOXQ
Dental implants are structures of synthetic materials that are placed into the jawbone to replace missing teeth. Accurate planning of the surgical process is vital to ensure a high success rate and prevent inadvertent damage.
In this demo, we will show how Microsoft HoloLens 2 can be of great value for planning dental implant placement in an intuitive, fast, and safe manner.
2. Augmented Reality and 3D Slicer for pedicle screw placement planning
Video: https://www.youtube.com/watch?v=35WiSceP94Q
Most spinal deformities caused by scoliosis, fractures, tumors, and degenerative diseases are treated with pedicle screws. This procedure involves fixing the spine with screws that provide strength and scaffolding for bony fusion. It is a challenging technique due to the proximity of sensitive anatomical structures. Improper screw placement can result in vascular and nerve injuries, as well as compromised screw retention and short-term implant failure.
AR can enhance the efficiency, safety, and accuracy of spinal instrumentation. This demo shows how the combination of Microsoft HoloLens 2 and 3D Slicer can lead to improved and faster planning of pedicle screw placement using 3D models and medical images based on patient data.
Surgical Skills Analysis with tracking systems, AR and 3D Slicer
1. Needle insertion for sacral nerve stimulation.
Link to paper with videos: https://www.sciencedirect.com/science/article/pii/S016926072200373X#ecom0001
Sacral nerve stimulation (SNS) is a minimally invasive procedure in which an electrode lead is implanted through the sacral foramina to stimulate the nerve that modulates colonic and urinary functions. One of the most crucial steps in SNS procedures is the placement of the tined lead close to the sacral nerve. However, needle insertion is very challenging for surgeons, and success is highly dependent on their experience. Several x-ray projections are required to interpret the needle position correctly. In many cases, multiple punctures are needed, increasing surgical time as well as the patient's discomfort and pain.
In this demo, we will show two navigation systems, one based on an optical tracking system and the other on augmented reality. Both aim to provide a training setup for needle insertion as well as surgical guidance for real procedures, reducing surgical time, minimizing patient discomfort, and improving surgical outcomes.
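As a simplified illustration of the core computation behind such guidance, the sketch below reads a tracked needle transform in 3D Slicer and reports the tip-to-target distance. The node names are placeholders, and the tip-at-origin convention is an assumption of this sketch.

```python
# Run in 3D Slicer's Python console. Node names are placeholders; the
# NeedleToReference transform is assumed to be streamed into Slicer
# (e.g., from an optical tracker via the PLUS toolkit and OpenIGTLink).
import numpy as np
import vtk

needleToReference = slicer.util.getNode("NeedleToReference")
targetNode = slicer.util.getNode("TargetPoint")

# Needle tip at the origin of the needle coordinate system (an assumption
# of this sketch); transform it into the reference coordinate system.
m = vtk.vtkMatrix4x4()
needleToReference.GetMatrixTransformToWorld(m)
tip = np.array(m.MultiplyPoint((0.0, 0.0, 0.0, 1.0)))[:3]

# Planned target: first control point of a markups point list.
target = slicer.util.arrayFromMarkupsControlPoints(targetNode)[0]

print(f"Tip-to-target distance: {np.linalg.norm(tip - target):.1f} mm")
```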
2. Birth Delivery Training.
Video of tracker and simulator: https://www.youtube.com/watch?v=1co0CtjTIlY&t=77s
Video of tracker and simulator with HoloLens: https://www.youtube.com/watch?v=rlEyJWCXmk8
The World Health Organization recommends a cesarean rate below 15%. However, actual rates in the US are roughly double this value, while the use of obstetrical instruments, a recommended alternative to cesarean delivery that requires high skill and experience, has decreased significantly in recent years. In this context there is a clear demand for simulators, with special interest in learning the correct use of Kielland's forceps.
In this demo, we present a virtual instrumented simulator to improve training in the correct use of forceps. It proposes a three-step protocol that guides users through the process while evaluating their performance. We will use a birth simulator together with an electromagnetic tracking system. Additionally, we will show how HoloLens 2 can be used to visualize the models in the 3D Slicer scene in real time.
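Performance evaluation in trainers of this kind typically reduces to motion metrics computed from the recorded tracking data. The function below is a self-contained sketch of one common metric, total path length; it is illustrative only and not the simulator's actual scoring code.

```python
# Sketch of a common motion-efficiency metric for skill assessment:
# total path length of a tracked instrument (illustrative only).
import numpy as np

def path_length_mm(positions: np.ndarray) -> float:
    """Sum of distances between consecutive tracked positions (N x 3, mm)."""
    steps = np.diff(positions, axis=0)
    return float(np.linalg.norm(steps, axis=1).sum())

# Example with made-up positions from an electromagnetic tracker.
recorded = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [10.0, 20.0, 0.0]])
print(path_length_mm(recorded))  # 30.0
```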
NousNav
NousNav Website: https://www.nousnav.org/
Computer-assisted neurosurgical navigation systems have become a critical component of modern neurosurgery, but many of these tools remain inaccessible in lower resource settings due to their high cost, dependence on consumables, and need for technical support. NousNav is a low-cost neuronavigation system built with the goal of democratizing access to computer-assisted navigation during neurosurgery. It uses open-source software, open-source hardware designs, and inexpensive, off-the-shelf components to provide a complete platform for planning and performing image-guided neurosurgery using optical tracking.
In this demo, we present the latest version of NousNav’s hardware and software in an interactive, hands-on format. Attendees will have the opportunity to use NousNav to plan a neurosurgical procedure, register the system to an anthropomorphic manikin, and use their plan to execute a simulated craniotomy. This demo will cover the basic principles of neurosurgical navigation as well as serve as a practical example of how 3D Slicer can be extended into a custom surgical platform. Ongoing work related to AI-based segmentation and surgical planning will be highlighted as well.
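Registering the system to the manikin rests on standard paired-point rigid registration. The sketch below implements the classic least-squares (Kabsch) solution with made-up landmark coordinates; it is independent of NousNav's actual code.

```python
# Sketch of paired-point (landmark) registration, the core step when
# registering a navigation system to a patient or manikin. Classic
# least-squares rigid fit (Kabsch); coordinates below are made up.
import numpy as np

def rigid_register(moving: np.ndarray, fixed: np.ndarray):
    """Return rotation R and translation t mapping moving -> fixed (N x 3 each)."""
    mc, fc = moving.mean(axis=0), fixed.mean(axis=0)
    H = (moving - mc).T @ (fixed - fc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = fc - R @ mc
    return R, t

# Landmarks touched with a tracked pointer vs. the same points in image space.
pointer_pts = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]], float)
image_pts = pointer_pts @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float).T + 5.0
R, t = rigid_register(pointer_pts, image_pts)
print(np.round(R, 3), np.round(t, 1))  # recovers the rotation and t = [5, 5, 5]
```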
Ultrasound depth camera tracking
Conventional tracking systems require external optical or electromagnetic sensors to be attached to surgical instruments, which can be intrusive for surgeons. Using a combination of color and depth imaging, we can instead track instruments such as ultrasound probes using AI-based segmentation.
In this demo, we present methods for streaming segmentation predictions to 3D Slicer in real time using a trained YOLOv8 model. Using these segmentations, we will project a point cloud of depth points onto the ultrasound probe, which can then be registered to a 3D model of the probe.
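The snippet below sketches the inference half of such a pipeline: segmenting the probe in a color frame with a YOLOv8 segmentation model. The weights file is a placeholder for custom-trained weights, the frame is a stand-in for a camera image, and the depth back-projection and OpenIGTLink streaming steps are reduced to a comment.

```python
# Sketch of the inference side of the demo: segment the ultrasound probe
# in a color frame with a YOLOv8 segmentation model (pip install ultralytics).
import numpy as np
from ultralytics import YOLO

model = YOLO("probe_seg.pt")  # placeholder: custom-trained probe weights

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
results = model(frame)

# Binary mask of the detected probe (first instance, if any). Masked depth
# pixels would then be back-projected into a point cloud and streamed to
# 3D Slicer, e.g., over OpenIGTLink.
masks = results[0].masks
if masks is not None:
    probe_mask = masks.data[0].cpu().numpy() > 0.5
```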
Organizing committee
Technical Organizers / Instructors:
- Rebecca Hisey, PhD Student (English, French)
  Queen's University, Kingston, Canada
- Colton Barr, MD/PhD Student
  Queen's University, Kingston, Canada
- Mónica García Sevilla, Postdoc (Spanish, English)
  Universidad Carlos III de Madrid, Spain
- Gabriella d'Albenzio
  Queen's University, Kingston, Canada
Instructors:
- Sidaty El Hadramy, PhD Student (French, Arabic, English)
  University of Strasbourg, Strasbourg, France
- Idrissa Seck, PhD Student (French, English)
  École Supérieure Polytechnique, Dakar, Sénégal
- Alicia Pose-Díez-de-la-Lastra, Postdoc (Spanish, English)
  Universidad Carlos III de Madrid, Spain
- Felix von Haxthausen, Postdoc (German, English)
  Universidad Carlos III de Madrid, Spain
Program Advisors:
- Gabor Fichtinger, Dr. Univ., Professor, MICCAI Fellow
  Queen's University, Kingston, Canada
- Ron Kikinis, MD, Professor, MICCAI Fellow
  Brigham and Women's Hospital, Harvard Medical School, Boston, USA
- Sonia Pujol, PhD, Assistant Professor
  Brigham and Women's Hospital, Harvard Medical School, Boston, USA
- Javier Pascau, PhD, Professor
  Universidad Carlos III de Madrid, Spain