| Federated learning coordinates a central server and multiple clients so that they can jointly train a model without exchanging local data. However, it incurs high communication overhead, because the clients and the server must communicate constantly to exchange large numbers of model parameters. Studying how to reduce this communication cost is therefore of great practical significance for deploying federated learning, especially on mobile devices under communication constraints. The communication overhead has two components: the total amount of data transmitted and the number of communication rounds.

Firstly, to reduce the total number of bits that must be transmitted, this paper proposes a federated averaging algorithm based on weight quantization, the Federated Averaging Algorithm Based on Dynamic Quantization (DQFedAvg). Each client performs a quantization operation before uploading its updated local weights to the central server, reducing the number of bits in a single communication round and thereby the overall communication overhead (a quantization sketch follows below).

Secondly, to reduce the number of communication rounds between the clients and the server, this paper proposes a Coupling-acceleration Two-way Lazily Aggregated Quantized-gradient algorithm (CT-LAQ). On the uplink, each client computes its gradient by coupling-accelerated gradient descent, quantizes the gradient before uploading, and then applies a criterion to decide whether to upload the current gradient or to skip the current round and let the server reuse the previous gradient. On the downlink, the server updates the global gradient through aggregation, quantizes the gradient before broadcasting, and likewise chooses either to broadcast the current gradient or to skip the current round so that clients reuse the previous gradient. By selectively skipping uploads and broadcasts, the number of communication rounds is greatly reduced, saving communication cost (both directions are sketched below).

While fully preserving the privacy and security of client data, the federated learning algorithms proposed in this paper combine quantization techniques with the coupling-accelerated gradient descent method to greatly reduce both the total amount of communication data and the number of communication rounds between the clients and the server, accelerating model convergence, improving training efficiency, and thereby lowering the system's communication cost. Extensive simulation experiments verify the effectiveness of the proposed algorithms.
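
To make the quantization step of DQFedAvg concrete, the following is a minimal sketch, assuming uniform min-max quantization of a weight tensor to a fixed bit-width. The paper's algorithm chooses the quantization level dynamically, which is not shown here, and all function names are illustrative rather than the paper's own.

```python
import numpy as np

def quantize_weights(weights: np.ndarray, num_bits: int = 8):
    """Client side: map each weight onto a uniform grid of 2**num_bits
    levels between the tensor's min and max, returning the integer
    codes plus the (offset, step) needed to dequantize."""
    w_min, w_max = float(weights.min()), float(weights.max())
    levels = 2 ** num_bits - 1
    step = (w_max - w_min) / levels if w_max > w_min else 1.0
    codes = np.round((weights - w_min) / step).astype(np.uint16)
    return codes, w_min, step

def dequantize_weights(codes: np.ndarray, w_min: float, step: float) -> np.ndarray:
    """Server side: recover an approximation of the original weights
    before federated averaging."""
    return w_min + codes.astype(np.float64) * step
```

Uploading the integer codes plus two floats per tensor costs roughly num_bits bits per parameter instead of 32, which is where the single-round savings come from.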
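
The uplink behavior of CT-LAQ can be sketched as follows. This is a simplified stand-in: a plain momentum term replaces the paper's coupling-accelerated gradient descent, and a single relative-change threshold replaces the lazy-aggregation criterion, which in the LAQ literature compares the gradient innovation against a weighted sum of recent model changes. All names and the threshold value are illustrative.

```python
import numpy as np

class LazyQuantizedClient:
    """Uplink sketch: quantize the (accelerated) gradient and upload it
    only when it has changed enough since the last upload."""

    def __init__(self, dim, num_bits=4, threshold=1e-3, momentum=0.9):
        self.last_uploaded = np.zeros(dim)  # gradient the server currently holds
        self.velocity = np.zeros(dim)       # momentum buffer (acceleration stand-in)
        self.num_bits = num_bits
        self.threshold = threshold
        self.momentum = momentum

    def _quantize(self, g):
        # Uniform symmetric quantization to num_bits.
        scale = float(np.abs(g).max()) or 1.0
        levels = 2 ** (self.num_bits - 1) - 1
        return np.round(g / scale * levels) / levels * scale

    def step(self, grad):
        # Momentum update standing in for coupling-accelerated descent.
        self.velocity = self.momentum * self.velocity + grad
        g_q = self._quantize(self.velocity)
        # Skip rule: upload only if the change since the last upload is
        # large relative to the gradient itself.
        change = np.linalg.norm(g_q - self.last_uploaded) ** 2
        if change >= self.threshold * np.linalg.norm(g_q) ** 2:
            self.last_uploaded = g_q
            return g_q   # transmit this round
        return None      # skip; server reuses last_uploaded
```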
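
The downlink is symmetric: the server aggregates the most recent gradient it holds for each client (reusing stale entries for clients that skipped), quantizes the aggregate, and applies the same kind of skip rule before broadcasting. Again a minimal sketch under the same assumptions; the mean aggregation, quantizer, and threshold are illustrative.

```python
import numpy as np

class LazyQuantizedServer:
    """Downlink sketch: aggregate per-client gradients, quantize the
    aggregate, and broadcast it only when it changed enough."""

    def __init__(self, num_clients, dim, num_bits=4, threshold=1e-3):
        self.stale = np.zeros((num_clients, dim))  # last gradient per client
        self.last_broadcast = np.zeros(dim)
        self.num_bits = num_bits
        self.threshold = threshold

    def round(self, uploads):
        # uploads: {client_id: gradient} containing only the clients
        # that did not skip this round; others keep their stale entry.
        for cid, g in uploads.items():
            self.stale[cid] = g
        agg = self.stale.mean(axis=0)
        # Quantize the aggregate before broadcasting.
        scale = float(np.abs(agg).max()) or 1.0
        levels = 2 ** (self.num_bits - 1) - 1
        agg_q = np.round(agg / scale * levels) / levels * scale
        # Skip rule mirrors the uplink: broadcast only on a large change.
        change = np.linalg.norm(agg_q - self.last_broadcast) ** 2
        if change >= self.threshold * np.linalg.norm(agg_q) ** 2:
            self.last_broadcast = agg_q
            return agg_q   # broadcast this round
        return None        # skip; clients reuse last_broadcast
```

Every skipped upload or broadcast removes one transmission of a full gradient, which is how CT-LAQ trades a small amount of staleness for fewer communication rounds.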