Algorithm 1. Adaptive Federated Learning Algorithm with Divergence Estimation and Weight Calculation.

Input: Global model parameters ωt; local dataset Di for each client i; local learning-rate schedule ηi(t) for each client i
Output: Updated global model parameters ωt+1

// Client-Side:
1: for each client i in parallel do
2:   for each epoch e do
3:     Train a local copy of ωt on Di using ηi(t)
4:     Obtain the updated local model parameters
5:   end for
6:   Calculate local performance metrics of the updated local model on Di
7:   Estimate the divergence between Pi (the client's data distribution) and Pt (the global model distribution) using the Kullback–Leibler (KL) divergence (Equation (2))
8:   if the divergence is acceptable then
9:     Calculate the importance weight based on the divergence estimate
10:  else
11:    Adjust the learning rate ηi(t) and retrain the local model
12:  end if
13: end for

// Server-Side:
14: Sample clients with probability proportional to their importance weights
15: Collect the local model updates from the selected clients
16: Aggregate the updates using a weighted average (Equation (4)) with the importance weights as the aggregation weights
17: Update the global model using Equation (5)
18: if the global model converges then
19:   Broadcast ωt+1 to all clients
20: else
21:   Repeat the process for the next round
22: end if
Updated global model parameters ωt+1
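The divergence-weighted aggregation at the core of Algorithm 1 can be sketched as follows. This is a minimal illustration, not the paper's implementation: the names `kl_divergence`, `importance_weight`, and `aggregate` are invented for this sketch, the distributions are toy label histograms standing in for Pi and Pt, and the weighting rule `1 / (1 + D_KL)` is an assumed example of "lower divergence earns a larger weight", since the paper's actual weight formula is given in its own equations.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(P || Q) between two discrete distributions (stand-in for Equation (2)).
    eps guards against log(0) for empty classes."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def importance_weight(divergence):
    # Assumed weighting rule (not from the paper): clients whose data
    # distribution is closer to the global one get a larger weight.
    return 1.0 / (1.0 + divergence)

def aggregate(client_params, weights):
    """Weighted average of client parameter vectors (stand-in for Equation (4))."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()  # normalize so the weights form a convex combination
    return np.average(np.stack(client_params), axis=0, weights=w)

# Toy round: three clients with label distributions compared to a global reference.
global_dist = np.array([0.25, 0.25, 0.25, 0.25])
client_dists = [
    np.array([0.25, 0.25, 0.25, 0.25]),  # matches the global distribution
    np.array([0.70, 0.10, 0.10, 0.10]),  # heavily skewed (non-IID) client
    np.array([0.40, 0.30, 0.20, 0.10]),  # mildly skewed client
]
client_params = [np.full(4, v) for v in (1.0, 2.0, 3.0)]  # placeholder updates

divs = [kl_divergence(p, global_dist) for p in client_dists]
weights = [importance_weight(d) for d in divs]
new_global = aggregate(client_params, weights)
```

In this sketch the client whose data matches the global distribution receives the largest weight, so its update dominates the aggregate; a server could likewise sample clients with probability proportional to these weights, as in step 14.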