Optimizing Model Utility in Differentially Private Federated Learning by Tuning Exposure Times

Published by: School of Information Science and Technology    Date: 2023-11-03

Title: Optimizing Model Utility in Differentially Private Federated Learning by Tuning Exposure Times

Speaker: Yipeng Zhou

Location: Faculty Lounge, 5th Floor, Academic Building No. 2

Time: November 7, 2023, 10:00 AM


Speaker Biography

Dr Yipeng Zhou is a senior lecturer in the School of Computing, Faculty of Science and Engineering, Macquarie University, Australia. He received the ARC Discovery Early Career Researcher Award (DECRA) in 2018. He obtained his Ph.D. degree from the Department of Information Engineering at The Chinese University of Hong Kong (CUHK) and his Bachelor's degree from the Department of Computer Science and Technology at the University of Science and Technology of China (USTC). His research interests lie in federated learning and privacy preservation. He has published more than 110 papers, including 34 papers in CCF A-ranked venues such as IEEE INFOCOM, IJCAI, IEEE ToN, TDSC, JSAC, TPDS, TMC, and TMM.



Abstract

Federated learning (FL) empowers distributed clients to collaboratively train a shared machine learning model by exchanging parameter information. Although FL protects clients' raw data from direct exposure, malicious users can still reconstruct original data from the disclosed parameters. To address this flaw, differential privacy (DP) is incorporated into FL clients to perturb the original parameters, which, however, can significantly impair model utility. This talk focuses on how to minimize the impact of DP noise on model utility by optimizing critical hyperparameters in DP-enhanced FL. We investigate how these hyperparameters, including the number of participants per round and the number of global iterations, affect the scale of DP noise and, in turn, model utility. The talk will also explore how these hyperparameters can be optimally determined. Finally, future work on this topic will be envisioned.
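
For context on how DP noise enters the training loop, the following is a minimal Python sketch of a generic Gaussian-mechanism client update with server-side averaging, of the kind studied in DP-enhanced FL. The function names, clipping norm, and noise multiplier are illustrative assumptions, not the speaker's specific algorithm; how the number of participants per round and the number of global iterations drive the required noise scale is precisely what the talk examines.

    import numpy as np

    def dp_client_update(local_update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
        # Clip the client's parameter update to bound its sensitivity, then add
        # Gaussian noise (generic Gaussian mechanism; hypothetical helper for illustration).
        rng = np.random.default_rng() if rng is None else rng
        norm = np.linalg.norm(local_update)
        clipped = local_update * min(1.0, clip_norm / (norm + 1e-12))
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape)
        return clipped + noise

    def server_aggregate(client_updates):
        # Average the perturbed updates from the clients sampled in this round.
        return np.mean(client_updates, axis=0)

    # Example round: 10 participating clients, each submitting a 5-dimensional update.
    updates = [dp_client_update(np.random.randn(5)) for _ in range(10)]
    global_step = server_aggregate(updates)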