Surgical Insight-guided Deep Learning for Colorectal Lesion Management


Tatar O. C., Çubukçu A.

Surgical Laparoscopy, Endoscopy and Percutaneous Techniques, vol.34, no.6, pp.559-565, 2024 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 34 Issue: 6
  • Publication Date: 2024
  • DOI: 10.1097/sle.0000000000001298
  • Journal Name: Surgical Laparoscopy, Endoscopy and Percutaneous Techniques
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, CINAHL, MEDLINE
  • Page Numbers: pp.559-565
  • Keywords: artificial intelligence, colorectal cancer, convolutional neural network, endoscopy, polyp
  • Kocaeli University Affiliated: Yes

Abstract

Background: Colonoscopy stands as a pivotal diagnostic tool in identifying gastrointestinal diseases, including potentially malignant tumors. The procedure, however, faces challenges in the precise identification of lesions during visual inspection. Recent strides in AI and machine learning technologies have opened avenues for enhanced medical imaging analysis, including in the field of colonoscopy.

Methods: In this study, we developed and evaluated a deep learning (DL) model, ColoNet, for detecting lesions in colonoscopic images. We analyzed 1760 images from 306 patients who underwent colorectal surgery between 2009 and 2022, meeting specific inclusion criteria. These images were used to train and validate ColoNet, employing the YOLOv8 architecture and various data augmentation techniques. Deep learning metrics were assessed via the YOLO architecture, and the trained model's diagnostic accuracy was assessed via sensitivity, specificity, positive predictive value, and negative predictive value.

Results: Our results on the validation dataset revealed a precision of 0.79604, a recall of 0.78086, an mAP50 of 0.83243, and an mAP50-95 of 0.4439. In addition, on a separate real-time dataset of 91 images consisting of both healthy tissue and suspect lesions, ColoNet achieved a sensitivity of 70.73%, a specificity of 92.00%, a positive predictive value (PPV) of 87.88%, and a negative predictive value (NPV) of 79.31%. The positive and negative likelihood ratios were 8.84 and 0.32, respectively, with an overall accuracy of 82.42%.

Conclusions: Our model has demonstrated promising results, indicating its potential as a valuable tool to assist surgeons during colonoscopy procedures. Its ability to detect suspicious lesions with potential malignancy offers a noteworthy advancement in the early diagnosis and management of colorectal cancers. Further multicentric, prospective research and validation are warranted to fully realize its clinical applicability and impact.
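As a worked check of the reported diagnostic statistics, the sketch below recomputes sensitivity, specificity, PPV, NPV, likelihood ratios, and accuracy from a 2x2 confusion matrix. The individual counts (TP = 29, FN = 12, TN = 46, FP = 4) are not stated in the abstract; they are back-solved here from the reported rates and the 91-image test-set size, so they are an assumption for illustration only.

```python
# Confusion-matrix counts inferred (not reported) from the abstract's
# rates: 41 suspect-lesion images and 50 healthy images, n = 91 total.
tp, fn = 29, 12  # assumed true positives / false negatives
tn, fp = 46, 4   # assumed true negatives / false positives

sensitivity = tp / (tp + fn)                # true positive rate
specificity = tn / (tn + fp)                # true negative rate
ppv = tp / (tp + fp)                        # positive predictive value
npv = tn / (tn + fn)                        # negative predictive value
lr_pos = sensitivity / (1 - specificity)    # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity    # negative likelihood ratio
accuracy = (tp + tn) / (tp + tn + fp + fn)  # overall accuracy

print(f"sensitivity={sensitivity:.2%}  specificity={specificity:.2%}")
print(f"PPV={ppv:.2%}  NPV={npv:.2%}")
print(f"LR+={lr_pos:.2f}  LR-={lr_neg:.2f}  accuracy={accuracy:.2%}")
```

Under these assumed counts, the formulas reproduce the abstract's figures (70.73% sensitivity, 92.00% specificity, 87.88% PPV, 79.31% NPV, LR+ 8.84, LR- 0.32, 82.42% accuracy).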