View/Open
electronics-11-01525-v3 (1).pdf (631.5 KB)
Date
2022
Authors
LERAT, Jean-Sébastien
Mahmoudi, Sidi Ahmed
Mahmoudi, Saïd

    Distributed Deep Learning: From Single-Node to Multi-Node Architecture

Abstract
In recent years, deep learning (DL) models have been applied to problems involving large datasets and complex architectures. These applications require methods to train models faster, such as distributed deep learning (DDL). This paper proposes an empirical approach to measuring the speedup that DDL achieves with different parallelism strategies on the nodes. Local parallelism is important in the design of a time-efficient multi-node architecture because overall DDL training time depends on the time required by every node. The impact of computational resources (CPU and GPU) is also discussed, since the GPU is known to speed up computations. Experimental results show that local parallelism affects the global speedup of DDL depending on the complexity of the neural model and the size of the dataset. Moreover, our approach achieves a better speedup than Horovod.
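The abstract contrasts local (intra-node) parallelism with multi-node DDL. As a purely illustrative sketch, and not the paper's own implementation or the Horovod baseline it compares against, the snippet below shows one common way to set up synchronous data-parallel training across nodes with TensorFlow's tf.distribute.MultiWorkerMirroredStrategy; the model, dataset, batch size, and cluster addresses are hypothetical.

    # Illustrative sketch of multi-node data-parallel training (not the paper's code).
    # Each node runs this script with its own TF_CONFIG environment variable, e.g.:
    #   {"cluster": {"worker": ["node0:12345", "node1:12345"]},
    #    "task": {"type": "worker", "index": 0}}
    import tensorflow as tf

    strategy = tf.distribute.MultiWorkerMirroredStrategy()

    # Local parallelism: each worker splits its share of the global batch
    # across its local GPUs (or CPU threads).
    global_batch_size = 64 * strategy.num_replicas_in_sync

    def make_dataset():
        (x, y), _ = tf.keras.datasets.mnist.load_data()
        ds = tf.data.Dataset.from_tensor_slices((x[..., None] / 255.0, y))
        return ds.shuffle(10_000).batch(global_batch_size).repeat()

    with strategy.scope():
        # Variables created inside the strategy scope are mirrored across
        # workers and gradients are all-reduced at each step.
        model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dense(10),
        ])
        model.compile(
            optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
            metrics=["accuracy"],
        )

    model.fit(make_dataset(), epochs=2, steps_per_epoch=100)

In such a setup, the speedup the abstract refers to is conventionally the single-node training time divided by the multi-node training time for the same model and dataset; the paper's exact measurement protocol is not reproduced here.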
