[1] Yang Qiming, Zhu Qi, Wang Mingming, et al. Multi-Site Brain Disease Diagnosis Method Based on Federated Knowledge Distillation[J]. Journal of Nanjing Normal University (Engineering and Technology), 2023, 23(01):18-24. [doi:10.3969/j.issn.1672-1292.2023.01.003]

Multi-Site Brain Disease Diagnosis Method Based on Federated Knowledge Distillation

Journal of Nanjing Normal University (Engineering and Technology) [ISSN:1006-6977/CN:61-1281/TN]

Volume: 23
Issue: 2023(01)
Pages: 18-24
Column: Computer Science and Technology
Publication date: 2023-03-15

Article Info

Title:
Multi-Site Brain Disease Diagnosis Method Based on Federated Knowledge Distillation
Article ID: 1672-1292(2023)01-0018-07
Author(s):
Yang Qiming1, Zhu Qi1, Wang Mingming1, Sun Kai2, Zhu Min3, Shao Wei1, Zhang Daoqiang1
(1. College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China)
(2. Shenzhen Huasai Ruifei Intelligent Technology Co., Ltd., Shenzhen 518063, China)
(3. Public Experimental Teaching Department, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China)
Keywords:
federated learning; knowledge distillation; brain disease diagnosis
CLC number: TP391
DOI:
10.3969/j.issn.1672-1292.2023.01.003
Document code: A
Abstract:
Multi-site disease diagnosis methods improve prediction accuracy by aggregating sample information from different medical institutions on a single server for centralized training, which effectively alleviates the small-sample problem in the medical field. However, two problems remain: data distributions differ across medical institutions, and patient privacy cannot be protected. To address these issues, we design a privacy-preserving federated knowledge distillation algorithm for multi-site brain disease diagnosis. First, a weighted averaging algorithm based on batch normalization is designed on the server to help the federated model extract features that are independent of each institution's data distribution. Then, a federated-teacher/local-student framework with a local classifier is deployed on each client: a distillation loss ensures that the model extracts localized features, while a classification loss keeps the model's performance stable. Experimental results show that the proposed algorithm outperforms existing methods on both autism and schizophrenia datasets.
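The server-side step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: following the FedBN-style idea the abstract alludes to, batch-normalization parameters are excluded from the sample-size-weighted average so each site keeps its own normalization statistics. Scalar values stand in for parameter tensors, and the function names (`aggregate`, `is_bn_param`) are hypothetical.

```python
def aggregate(client_weights, client_sizes, is_bn_param):
    """Sample-size-weighted average of client parameters.

    client_weights: one {param_name: value} dict per client.
    client_sizes:   number of training samples at each client.
    is_bn_param:    predicate marking batch-norm parameters, which are
                    skipped so they remain local to each site.
    """
    total = sum(client_sizes)
    global_weights = {}
    for name in client_weights[0]:
        if is_bn_param(name):
            continue  # BN statistics stay site-specific
        global_weights[name] = sum(
            w[name] * n / total for w, n in zip(client_weights, client_sizes)
        )
    return global_weights
```

In a real system the same weighting would be applied element-wise to each parameter tensor before broadcasting the result back to the clients.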
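The client-side objective combines the two losses the abstract names. The sketch below is an assumption about their standard form (soft-target cross-entropy for distillation, hard-label cross-entropy for classification); the weighting `alpha` and temperature `T` are illustrative hyperparameters, not values from the paper.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def client_loss(student_logits, teacher_logits, label, alpha=0.5, T=2.0):
    """Combined client objective: the classification term keeps the local
    student accurate on hard labels, while the distillation term pulls it
    toward the federated teacher's softened predictions."""
    p_student = softmax(student_logits)
    ce = -math.log(p_student[label])                      # classification loss
    q_teacher = softmax(teacher_logits, T)
    p_soft = softmax(student_logits, T)
    kd = -sum(q * math.log(p)                             # distillation loss
              for q, p in zip(q_teacher, p_soft))
    return alpha * ce + (1 - alpha) * kd
```

A student that agrees with both the label and the teacher incurs a smaller loss than one that contradicts them, which is what drives the local model toward the federated consensus while preserving accuracy.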

References:

[1]ZHOU H Y,ZHANG D Q. Hypergraph convolutional neural network for multi-center data and its applications[J]. Computer Science,2022,49(3):129-133. (in Chinese)
[2]MCMAHAN H B,MOORE E,RAMAGE D,et al. Communication-efficient learning of deep networks from decentralized data[C]//Proceedings of the 20th International Conference on Artificial Intelligence and Statistics(AISTATS). Fort Lauderdale,USA:JMLR,2017.
[3]YANG Q,LIU Y,CHEN T,et al. Federated machine learning:concept and applications[J]. ACM Transactions on Intelligent Systems and Technology,2019,10(2):1-19.
[4]YANG Q,LIU Y,CHENG Y,et al. Federated Learning[M]. San Rafael,USA:Morgan & Claypool Publishers,2019.
[5]LI T,SAHU A K,TALWALKAR A,et al. Federated learning:challenges,methods,and future directions[J]. IEEE Signal Processing Magazine,2020,37(3):50-60.
[6]HINTON G,VINYALS O,DEAN J. Distilling the knowledge in a neural network[J]. arXiv Preprint arXiv:1503.02531,2015.
[7]VIELZEUF V,LECHERVY A,PATEUX S,et al. Towards a general model of knowledge for facial analysis by multi-source transfer learning[J]. arXiv Preprint arXiv:1911.03222,2019.
[8]WANG J,BAO W D,SUN L C,et al. Private model compression via knowledge distillation[J]. Proceedings of the AAAI Conference on Artificial Intelligence,2019,33(1):1190-1197.
[9]VONGKULBHISAL J,VINAYAVEKHIN P,VISENTINI-SCARZANELLA M. Unifying heterogeneous classifiers with distillation[C]//Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition(CVPR). Long Beach,USA:IEEE,2019.
[10]SHELLER M J,REINA G A,EDWARDS B,et al. Multi-institutional deep learning modeling without sharing patient data:a feasibility study on brain tumor segmentation[C]//Proceedings of the 4th International MICCAI Brainlesion Workshop. Granada,Spain:Springer,2018.
[11]ZHANG W S,ZHOU T,LU Q H,et al. Dynamic fusion-based federated learning for COVID-19 detection[J]. IEEE Internet of Things Journal,2021,8(21):15884-15891.
[12]MA X,ZHU J,LIN Z,et al. A state-of-the-art survey on solving Non-IID data in federated learning[J]. Future Generation Computer Systems,2022,135:244-258.
[13]JEONG E,OH S,KIM H,et al. Communication-efficient on-device machine learning:federated distillation and augmentation under Non-IID private data[J]. arXiv Preprint arXiv:1811.11479,2018.
[14]JIANG D L,SHAN C,ZHANG Z H. Federated learning algorithm based on knowledge distillation[C]//Proceedings of the 2020 International Conference on Artificial Intelligence and Computer Engineering(ICAICE). Beijing,China:IEEE,2020.
[15]CHA H,PARK J,KIM H,et al. Proxy experience replay:federated distillation for distributed reinforcement learning[J]. IEEE Intelligent Systems,2020,35(4):94-101.
[16]ITAHARA S,NISHIO T,KODA Y,et al. Distillation-based semi-supervised federated learning for communication-efficient collaborative training with Non-IID private data[J]. arXiv Preprint arXiv:2008.06180,2020.
[17]MARTINO A D,YAN C G,LI Q,et al. The autism brain imaging data exchange:towards a large-scale evaluation of the intrinsic brain architecture in autism[J]. Molecular Psychiatry,2014,19(6):659-667.
[18]LI X X,JIANG M R,ZHANG X F,et al. FedBN:federated learning on Non-IID features via local batch normalization[J]. arXiv Preprint arXiv:2102.07623,2021.
[19]LI T,SAHU A K,ZAHEER M,et al. Federated optimization in heterogeneous networks[J]. arXiv Preprint arXiv:1812.06127,2020.


Memo:
Received: 2022-09-15.
Corresponding author: Zhu Qi, PhD, associate professor; research interests: machine learning, pattern recognition, brain disease diagnosis. E-mail: zhuqinuaa@163.com
Last Update: 2023-03-15