Multi-Site Brain Disease Diagnosis Method Based on Federated Knowledge Distillation

Journal of Nanjing Normal University (Engineering and Technology Edition)[ISSN:1006-6977/CN:61-1281/TN]

Issue:
2023, No. 01
Page:
18-24
Research Field:
Computer Science and Technology
Publishing date:

Info

Title:
Multi-Site Brain Disease Diagnosis Method Based on Federated Knowledge Distillation
Author(s):
Yang Qiming1,Zhu Qi1,Wang Mingming1,Sun Kai2,Zhu Min3,Shao Wei1,Zhang Daoqiang1
(1.College of Computer Science and Technology,Nanjing University of Aeronautics and Astronautics,Nanjing 211106,China) (2.Shenzhen Huasai Ruifei Intelligent Technology Co.,Ltd.,Shenzhen 518063,China) (3.Public Experimental Teaching Department,Nanjing University of Aeronautics and Astronautics,Nanjing 211106,China)
Keywords:
federated learning;knowledge distillation;brain disease diagnosis
CLC Number:
TP391
DOI:
10.3969/j.issn.1672-1292.2023.01.003
Abstract:
The multi-site disease diagnosis method improves prediction accuracy by aggregating sample information from different medical institutions on a single server, which effectively alleviates the small-sample problem in the medical field. However, most such approaches face two problems in medical settings: data are distributed differently across institutions, and patient privacy cannot be protected. To address these issues, we design a federated knowledge distillation algorithm for privacy-preserving multi-site brain disease diagnosis. First, a weighted averaging algorithm based on batch normalization is designed on the server to help the federated model extract distribution-independent features from each medical institution. Then, a framework pairing a federated teacher model with a local student model is designed on each client, and a local classifier is deployed; the distillation loss guarantees that the model extracts localized features, while the classification loss keeps its performance stable. Experimental results show that the proposed algorithm outperforms existing algorithms on autism and schizophrenia datasets.
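
As a rough illustration of the two components described in the abstract, a minimal PyTorch-style sketch follows. It is an assumption, not the authors' implementation: the BN-layer naming convention ("bn" appearing in parameter keys), the temperature T, and the mixing weight alpha are all hypothetical choices introduced here for illustration.

```python
import copy
import torch.nn.functional as F


def server_aggregate(client_states, client_weights):
    """Weighted average of client model parameters on the server.

    BatchNorm parameters and running statistics are skipped so each site
    keeps its own normalization -- one plausible reading of a "weighted
    average based on batch normalization" for Non-IID features (cf. FedBN,
    reference [18]).
    """
    total = sum(client_weights)
    global_state = copy.deepcopy(client_states[0])
    for key in global_state:
        if "bn" in key:  # assumes BN layers are named "bn*"; kept site-specific
            continue
        global_state[key] = sum(
            (w / total) * state[key]
            for state, w in zip(client_states, client_weights)
        )
    return global_state


def client_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Client-side objective: the distillation term pulls the local student
    toward the federated teacher; the classification term keeps its
    performance stable on local labels."""
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

Skipping BN parameters during averaging follows the same intuition as FedBN [18]: site-specific normalization absorbs distributional differences between institutions, while the shared weights capture distribution-independent features.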

References:

[1]ZHOU H Y,ZHANG D Q. Hypergraph convolutional neural networks for multi-center data and their applications[J]. Computer Science,2022,49(3):129-133.
[2]MCMAHAN H B,MOORE E,RAMAGE D,et al. Communication-efficient learning of deep networks from decentralized data[C]//Proceedings of the 20th International Conference on Artificial Intelligence and Statistics(AISTATS). Fort Lauderdale,USA:JMLR,2017.
[3]YANG Q,LIU Y,CHEN T,et al. Federated machine learning:concept and applications[J]. ACM Transactions on Intelligent Systems and Technology,2019,10(2):1-19.
[4]YANG Q,LIU Y,CHENG Y,et al. Federated Learning[M]. San Rafael,USA:Morgan & Claypool Publishers,2019.
[5]LI T,SAHU A K,TALWALKAR A,et al. Federated learning:challenges,methods,and future directions[J]. IEEE Signal Processing Magazine,2020,37(3):50-60.
[6]HINTON G,VINYALS O,DEAN J. Distilling the knowledge in a neural network[J]. arXiv Preprint arXiv:1503.02531,2015.
[7]VIELZEUF V,LECHERVY A,PATEUX S,et al. Towards a general model of knowledge for facial analysis by multi-source transfer learning[J]. arXiv Preprint arXiv:1911.03222,2019.
[8]WANG J,BAO W D,SUN L C,et al. Private model compression via knowledge distillation[J]. Proceedings of the AAAI Conference on Artificial Intelligence,2019,33(1):1190-1197.
[9]VONGKULBHISAL J,VINAYAVEKHIN P,VISENTINI-SCARZANELLA M. Unifying heterogeneous classifiers with distillation[C]//Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition(CVPR). Long Beach,USA:IEEE,2019.
[10]SHELLER M J,REINA G A,EDWARDS B,et al. Multi-institutional deep learning modeling without sharing patient data:a feasibility study on brain tumor segmentation[C]//Proceedings of the 4th International MICCAI Brainlesion Workshop. Granada,Spain:Springer,2018.
[11]ZHANG W S,ZHOU T,LU Q H,et al. Dynamic fusion-based federated learning for COVID-19 detection[J]. IEEE Internet of Things Journal,2021,8(21):15884-15891.
[12]MA X,ZHU J,LIN Z,et al. A state-of-the-art survey on solving Non-IID data in federated learning[J]. Future Generation Computer Systems,2022,135:244-258.
[13]JEONG E,OH S,KIM H,et al. Communication-efficient on-device machine learning:federated distillation and augmentation under Non-IID private data[J]. arXiv Preprint arXiv:1811.11479,2018.
[14]JIANG D L,SHAN C,ZHANG Z H. Federated learning algorithm based on knowledge distillation[C]//Proceedings of the 2020 International Conference on Artificial Intelligence and Computer Engineering(ICAICE). Beijing,China:IEEE,2020.
[15]CHA H,PARK J,KIM H,et al. Proxy experience replay:federated distillation for distributed reinforcement learning[J]. IEEE Intelligent Systems,2020,35(4):94-101.
[16]ITAHARA S,NISHIO T,KODA Y,et al. Distillation-based semi-supervised federated learning for communication-efficient collaborative training with Non-IID private data[J]. arXiv Preprint arXiv:2008.06180,2020.
[17]MARTINO A D,YAN C G,LI Q,et al. The autism brain imaging data exchange:towards a large-scale evaluation of the intrinsic brain architecture in autism[J]. Molecular Psychiatry,2014,19(6):659-667.
[18]LI X X,JIANG M R,ZHANG X F,et al. FedBN:federated learning on Non-IID features via local batch normalization[J]. arXiv Preprint arXiv:2102.07623,2021.
[19]LI T,SAHU A K,ZAHEER M,et al. Federated optimization in heterogeneous networks[J]. arXiv Preprint arXiv:1812.06127,2020.

Last Update: 2023-03-15