Alex Barros - Profile on Academia.edu

Alex Barros

Related authors: Sixing Yu, Chunming Rong, Samuel Horvath, Aniruddha Bardhan, Avinash Chakravarthi, Hunmin Lee, Morteza Homayounfar, Marios Fournarakis, Andrea Rizzardi, Pedro Leobardo Jiménez Sánchez

Uploads

Papers by Alex Barros

A strategy to the reduction of communication overhead and overfitting in Federated Learning

Anais do XXVI Workshop de Gerência e Operação de Redes e Serviços (WGRS 2021), 2021

Federated learning (FL) is a framework for training machine learning models on decentralized data, which is often unbalanced and non-IID. Adaptive methods can be used to accelerate convergence, reducing the number of rounds of local computation and communication with a centralized server. This paper proposes an adaptive controller that uses a Poisson distribution to adapt the number of local epochs, avoiding overfitting of the aggregated model and promoting fast convergence. Our results indicate that simply increasing the number of local updates should be avoided and that a complementary mechanism is needed to preserve model performance. We evaluate the impact of an increasing number of local epochs on FedAVG and FedADAM.
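To make the idea in the abstract concrete, below is a minimal sketch of a FedAvg-style training loop in which the number of local epochs per round is drawn from a Poisson distribution instead of being fixed. The toy data, the `local_sgd` helper, the assumed Poisson mean (`mean_epochs`), and the per-round (rather than per-client) sampling are all illustrative assumptions; they are not the paper's actual controller or experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: every client holds data from the same linear model y = X @ w_true + noise,
# split unevenly to mimic unbalanced client datasets.
w_true = np.array([2.0, -1.0, 0.5])
num_clients, dim = 5, 3
client_sizes = rng.integers(20, 200, size=num_clients)  # unbalanced partitions
clients = []
for n in client_sizes:
    X = rng.normal(size=(n, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    clients.append((X, y))

def local_sgd(w, X, y, epochs, lr=0.01, batch=16):
    """Run `epochs` passes of mini-batch SGD on one client's data."""
    w = w.copy()
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            b = idx[start:start + batch]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

w_global = np.zeros(dim)
mean_epochs = 3  # assumed Poisson mean; an adaptive controller could tune this value

for rnd in range(20):
    # Sample this round's local-epoch budget from a Poisson distribution
    # (at least 1 epoch so every client does some work).
    epochs = max(1, rng.poisson(mean_epochs))

    # Each client trains locally for the sampled number of epochs.
    local_weights = [local_sgd(w_global, X, y, epochs) for X, y in clients]

    # FedAvg aggregation: weight client models by their dataset size.
    total = client_sizes.sum()
    w_global = sum(n * w for n, w in zip(client_sizes, local_weights)) / total

    loss = np.mean([np.mean((X @ w_global - y) ** 2) for X, y in clients])
    print(f"round {rnd:2d}  epochs={epochs}  loss={loss:.4f}")
```

Randomizing the local-epoch budget keeps the average amount of local computation low while occasionally allowing longer local training, which is one simple way to trade communication rounds against the risk of client drift and overfitting that the abstract describes.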
