Algorithm 1 Federated Averaging. The cluster contains N clients in total, each with learning rate η. The set of all clients is denoted S, the communication interval is denoted E, and the fraction of clients selected per round is denoted C.
On Server:
  1:  Initialization: global model $w_0$.
  2:  for each global epoch $t = 1, \dots, \mathrm{epoch}$ do
  3:      # Determine the number of participants.
  4:      $m \leftarrow \max(C \cdot N, 1)$
  5:      # Randomly choose participants.
  6:      $S_p = \mathrm{RandomChoice}(S, m)$
  7:      for each client $k \in S_p$ in parallel do
  8:          # Get the client's updated model.
  9:          $w_{t+1}^k \leftarrow \mathrm{OnClientUpdate}(k, w_t)$
 10:      end for
 11:      # Update the global model.
 12:      $w_{t+1} \leftarrow \sum_{k=0}^{N} p_k \, w_{t+1}^k$
 13:  end for

OnClientUpdate($k$, $w_0$):
 14:  for each client epoch $e$ do
 15:      # Do local model training on the local dataset.
 16:      $w_{e+1} \leftarrow w_e - \eta \nabla F(w_e)$
 17:  end for
 18:  return $w_{e+1}$
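For readers who prefer running code, the following Python sketch mirrors Algorithm 1 under simplifying assumptions that are not part of the paper: models are flat NumPy weight vectors, each client's local objective is a least-squares loss, the weights $p_k$ are set to each client's share of the sampled clients' total data, and the parallel client loop is simulated sequentially.

import numpy as np

# Illustrative FedAvg simulation following Algorithm 1.
# Assumptions (not from the paper): flat NumPy weight vectors as models,
# a least-squares local objective, p_k proportional to local sample count,
# and a sequential loop standing in for the parallel client updates.

def on_client_update(w, X, y, eta=0.1, client_epochs=5):
    # Local training: plain gradient descent, w_{e+1} <- w_e - eta * grad F(w_e).
    w = w.copy()
    for _ in range(client_epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - eta * grad
    return w

def federated_averaging(clients, w0, C=0.5, global_epochs=20, seed=0):
    # Server loop: sample a fraction C of the N clients each round and
    # aggregate their models weighted by local dataset size (p_k).
    rng = np.random.default_rng(seed)
    N = len(clients)
    w = w0.copy()
    for _ in range(global_epochs):
        m = max(int(C * N), 1)                         # number of participants
        chosen = rng.choice(N, size=m, replace=False)  # S_p = RandomChoice(S, m)
        total = sum(len(clients[k][1]) for k in chosen)
        new_w = np.zeros_like(w)
        for k in chosen:
            X_k, y_k = clients[k]
            w_k = on_client_update(w, X_k, y_k)        # w_{t+1}^k
            new_w += (len(y_k) / total) * w_k          # accumulate p_k * w_{t+1}^k
        w = new_w                                      # w_{t+1}
    return w

# Usage: synthetic regression data split across 4 clients.
if __name__ == "__main__":
    rng = np.random.default_rng(42)
    w_true = np.array([2.0, -1.0])
    clients = []
    for _ in range(4):
        X = rng.normal(size=(50, 2))
        y = X @ w_true + 0.1 * rng.normal(size=50)
        clients.append((X, y))
    w_global = federated_averaging(clients, w0=np.zeros(2))
    print("learned weights:", w_global)  # should approach w_true

In this sketch the weights $p_k$ are renormalized over the clients sampled in each round so that they sum to one, a common convention when only a fraction C of the clients participates in a given aggregation step.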