Hypergraph-based Multi-instance Contrastive Reinforcement Learning for Annotation-free Pan-cancer Survival Prediction on Whole Slide Histology Images

crossref(2024)

Abstract
The computational challenges of using digital pathology for prognosis stem from the fact that a typical gigapixel slide may comprise thousands of image tiles, and previous models lack the capability to model crucial slide-level contextual information, resulting in suboptimal performance. To address these challenges, we propose a Hypergraph-based Multi-instance Contrastive Reinforcement learning model (HeMiCoRe). HeMiCoRe employs a novel hypergraph construction method that simultaneously encodes the local morphology and spatial relationships of dynamically selected image tiles. It exploits both the cluster-restricted local and across-cluster global information of whole slide images (WSIs), with hypergraph-level contrastive learning for better generalization. To assess the efficacy of HeMiCoRe, we conducted extensive internal and external validation experiments for survival prediction on 5,195 whole slide images from 4,624 patients across 10 major cancer types. HeMiCoRe attains state-of-the-art performance on 8 cancer types and outstanding generalization performance. Furthermore, we demonstrate that the representative patch selector trained by reinforcement learning can exploit image tiles highly relevant to survival prediction. In summary, HeMiCoRe is a hypergraph neural network-based model with enhanced capabilities for capturing both intra- and inter-cluster interaction mechanisms, which enables it to outperform existing weakly supervised classification models in pan-cancer survival prediction tasks. The model could be adopted in clinical practice for prognosticating cancer patients and facilitating clinical decision-making.
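To make the hypergraph idea concrete, the sketch below builds an incidence matrix over slide tiles: each selected tile spawns one hyperedge joining it to its k nearest spatial neighbours, so hyperedges capture spatial relationships while per-tile feature vectors serve as node attributes for local morphology. This is an illustrative toy construction only, not the paper's exact method; the function name, the k-NN hyperedge rule, and all variables are assumptions.

```python
import numpy as np

def build_hypergraph(tile_xy, k=2):
    """Toy hypergraph construction (illustrative, not HeMiCoRe's method):
    hyperedge j connects tile j to its k nearest spatial neighbours.
    Returns the n x n incidence matrix H (rows: tiles, cols: hyperedges)."""
    n = len(tile_xy)
    H = np.zeros((n, n))
    for i in range(n):
        # Euclidean distance from tile i to every tile (itself included)
        d = np.linalg.norm(tile_xy - tile_xy[i], axis=1)
        members = np.argsort(d)[: k + 1]  # tile i plus its k nearest neighbours
        H[members, i] = 1.0
    return H

# Toy example: 5 tiles at grid coordinates; features would be attached as
# node attributes in a real model (e.g. CNN embeddings of each tile).
coords = np.array([[0, 0], [0, 1], [1, 0], [1, 1], [5, 5]], dtype=float)
H = build_hypergraph(coords, k=2)
print(H.shape)           # (5, 5): 5 nodes, 5 hyperedges
print(H.sum(axis=0))     # each hyperedge has k + 1 = 3 member tiles
```

A downstream hypergraph neural network would then propagate tile features along H, letting each tile aggregate context from its spatial neighbourhood rather than being treated as an independent instance.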