Sensors. 2022 Sep 18;22(18):7061. doi: 10.3390/s22187061
Algorithm 1: System model of FL
Input: C, set of connected clients; E, number of global iterations; K, number of local epochs; η, learning rate; b, local mini-batch size; r, client selection rate
Output: Global model W
1: procedure Server(E, K) ▹ executed by the central server
2:   W_0 ← Initialization ▹ initialize global model; K is constant
3:   for global iteration t = 1, …, E do
4:    N ← max(⌊|C| × r⌋, 1)
5:    S_t ← ClientSelection(C, N) ▹ select N clients for training
6:    for each client c_x ∈ S_t in parallel do
7:     Broadcast W_t to client c_x
8:     W_t^x ← ClientUpdate(c_x, K, W_t) ▹ local update on client c_x
9:    end for
10:    Update global model W_{t+1} ← Σ_{x=1}^{|S_t|} (|D_x| / |D_t|) · W_t^x ▹ D_t = ∪_{x∈S_t} D_x
11:   end for
12: end procedure
13:
14:
15: procedure ClientUpdate(c_x, K, W_t)
16:   B ← split local data D_x into batches of size b
17:   w_t^x ← W_t ▹ replace local model with the global model
18:   for local epoch e = 1, …, K do
19:    for batch b ∈ B do
20:     w_{t,e}^x ← w_{t,e}^x − η ∇F_x(w_{t,e}^x, b) ▹ mini-batch SGD
21:    end for
22:   end for
23:   W_t^x ← w_{t,K}^x ▹ upload local update result to the server
24: end procedure
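The control flow of Algorithm 1 can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: the model is a single weight vector, each client's loss F_x is mean squared error (so the gradient has a closed form), and all names (`server`, `client_update`, the toy dataset) are assumptions introduced here. Selected clients run sequentially as a stand-in for the parallel loop.

```python
# Minimal FedAvg-style sketch of Algorithm 1 (illustrative, not the
# paper's code). Weights are a NumPy vector; F_x is per-client MSE.
import numpy as np

rng = np.random.default_rng(0)

def client_update(W_t, data, K, eta, b):
    """ClientUpdate: K local epochs of mini-batch SGD on one client."""
    X, y = data
    w = W_t.copy()                            # replace local model with W_t
    for _ in range(K):                        # local epochs e = 1..K
        idx = rng.permutation(len(y))         # reshuffle batches each epoch
        for start in range(0, len(y), b):     # mini-batches of size b
            batch = idx[start:start + b]
            Xb, yb = X[batch], y[batch]
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(yb)  # ∇F_x for MSE
            w -= eta * grad                   # SGD step
    return w                                  # upload W_t^x = w_{t,K}^x

def server(clients, E, K, eta=0.05, b=8, r=0.5):
    """Central server loop: select clients, broadcast, aggregate."""
    dim = clients[0][0].shape[1]
    W = np.zeros(dim)                         # W_0 initialization
    for t in range(E):                        # global iterations t = 1..E
        N = max(int(len(clients) * r), 1)
        S_t = rng.choice(len(clients), size=N, replace=False)
        updates, sizes = [], []
        for x in S_t:                         # sequential stand-in for parallel
            updates.append(client_update(W, clients[x], K, eta, b))
            sizes.append(len(clients[x][1]))
        D_t = sum(sizes)                      # |D_t|: samples of selected clients
        W = sum((n / D_t) * w for n, w in zip(sizes, updates))  # weighted avg
    return W

# Toy federated dataset: each client holds noisy samples of the same
# linear model y = X @ w_true.
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(6):
    X = rng.normal(size=(40, 3))
    y = X @ w_true + 0.01 * rng.normal(size=40)
    clients.append((X, y))

W = server(clients, E=20, K=3)
print(np.round(W, 2))
```

With homogeneous client data as in this toy setup, the weighted average in the server loop behaves like ordinary SGD and the global model converges toward `w_true`; the weighting by |D_x|/|D_t| only matters once clients hold different amounts of data.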