Revision as of 12:22, 9 January 2020

Diagnostic Image Analysis Group

The Diagnostic Image Analysis Group is part of the Departments of Radiology and Nuclear Medicine, Pathology, and Ophthalmology of Radboud University Medical Center. We develop computer algorithms to aid clinicians in the interpretation of medical images and thereby improve the diagnostic process.

The group has its roots in computer-aided detection of breast cancer in mammograms, and we have expanded to automated detection and diagnosis in breast MRI, ultrasound and tomosynthesis, chest radiographs and chest CT, prostate MRI, neuro-imaging and the analysis of retinal and digital pathology images. The technology we primarily use is deep learning.

It is our goal to have a significant impact on healthcare by bringing our technology to the clinic. We are therefore fully certified to develop, maintain, and distribute software for the analysis of medical images in a quality-controlled environment (MDD Annex II and ISO 13485).

On this site you will find information about the history of the group and our collaborations, an overview of the people in DIAG, current projects, publications and theses, contact information, and information for those interested in joining our team.

Highlights

January, 2020

Figure: Lancet II.png

Currently, the Gleason score, assigned by a pathologist after microscopic examination of cancer morphology, is the most powerful prognostic marker for prostate cancer patients. However, it suffers from significant inter- and intra-observer variability, limiting its usefulness for individual patients. This problem could be solved by the fully automated deep learning system developed by Wouter Bulten and his colleagues. Their work appeared online in The Lancet Oncology.


The system was developed using 5759 biopsies from 1243 patients. A semi-automatic labeling technique was used to circumvent the need for manual pixel-level annotations. The study covered the full range of Gleason grades and was evaluated on a large cohort of patients with an expert consensus reference standard, as well as on an external tissue microarray test set.


The figure above depicts the development of the deep learning system. The authors employed a semi-automated method of labeling the training data (top row), removing the need for manual annotations by pathologists. The final system assigns Gleason growth patterns at the cell level and achieved high agreement with the reference standard (quadratic kappa 0.918). In a separate observer experiment, the deep learning system outperformed 10 of 15 pathologists in agreement with the reference standard. The system was validated on an external test set, where it achieved an AUC of 0.977 for distinguishing benign from malignant biopsies and an AUC of 0.871 when using grade group 2 as a cut-off.
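The quadratic kappa reported above is the quadratically weighted Cohen's kappa, a standard metric for agreement on ordinal labels such as Gleason grade groups: disagreements are penalized by the squared distance between grades, and the score is corrected for chance agreement. As a minimal illustration (not code from the paper; the example labels are hypothetical), it can be computed from two sets of grades like this:

```python
from collections import Counter

def quadratic_weighted_kappa(rater_a, rater_b, n_classes):
    """Quadratically weighted Cohen's kappa between two ordinal label
    sequences: 1 - (weighted observed disagreement) / (weighted
    disagreement expected by chance)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed confusion matrix.
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for a, b in zip(rater_a, rater_b):
        obs[a][b] += 1
    # Expected matrix under independence (outer product of marginals).
    hist_a, hist_b = Counter(rater_a), Counter(rater_b)
    exp = [[hist_a[i] * hist_b[j] / n for j in range(n_classes)]
           for i in range(n_classes)]
    # Quadratic disagreement weights, normalized to [0, 1].
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Hypothetical grade groups (0-5) for six biopsies: one near-miss.
system = [0, 1, 2, 2, 4, 5]
panel  = [0, 1, 2, 3, 4, 5]
print(round(quadratic_weighted_kappa(system, panel, 6), 3))  # -> 0.971
```

Because the weights grow quadratically with the distance between grades, a system that confuses grade group 2 with 3 is penalized far less than one that confuses benign with grade group 5, which matches the clinical severity of such errors.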


Click here to try Wouter's algorithm on your own data and to learn more about the automated Gleason grading project.


More Research Highlights.

News

  • June 12, 2020 - During the Euroson 2020 webinar, Thomas van den Heuvel won the Young Investigator Award of the European Federation of Societies for Ultrasound in Medicine and Biology with his abstract entitled: “Introducing prenatal ultrasound screening in research-limited settings using artificial intelligence”.
  • March 18, 2020 - The defense of Midas Meijs' PhD thesis, titled 'Automated Image Analysis and Machine Learning to Detect Cerebral Vascular Pathology in 4D-CTA', has been postponed because of COVID-19. A new date will follow.

More News.