Vision-based personalized Wireless Capsule Endoscopy for smart healthcare: Taxonomy, literature review, opportunities and challenges

Khan Muhammad, Salman Khan, Neeraj Kumar, Javier Del Ser, Seyedali Mirjalili

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Wireless Capsule Endoscopy (WCE) is a patient-friendly approach for digestive tract monitoring that supports medical experts in identifying anomalies inside the human Gastrointestinal (GI) tract. The automatic recognition of such abnormalities is essential for early diagnosis and saves time. To this end, several computer-aided diagnosis (CAD) methods have been proposed in the literature for automatic abnormal region segmentation, summarization, classification, and personalization in WCE videos. In this work, we provide a detailed review of computer vision-based methods for WCE video analysis. Firstly, all the major domains of WCE video analytics are identified, along with their generic processing flow. Secondly, we comprehensively review WCE video analysis methods and surveys presented to date, discussing their pros and cons. In addition, this paper reviews several representative public datasets used for the performance assessment of WCE techniques and methods. Finally, the most important aspect of this survey is the identification of several research trends and open issues in different domains of WCE, with an emphasis placed on future research directions towards smarter healthcare and personalization.

Original language: English
Pages (from-to): 266-280
Number of pages: 15
Journal: Future Generation Computer Systems
Volume: 113
DOIs
Publication status: Published - Dec 2020

Keywords

  • Artificial intelligence
  • Biomedical data analysis
  • Data science
  • Health monitoring
  • Smart healthcare
  • Wireless Capsule Endoscopy

