Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation


dc.contributor.author Mishra, Rahul
dc.contributor.author Gupta, Hari Prabhat
dc.date.accessioned 2024-02-13T06:49:58Z
dc.date.available 2024-02-13T06:49:58Z
dc.date.issued 2023-07-19
dc.identifier.issn 1536-1233
dc.identifier.uri http://localhost:8080/xmlui/handle/123456789/2891
dc.description This paper was published in open-access mode with affiliation IIT (BHU), Varanasi. en_US
dc.description.abstract The automated feature-extraction capability and strong performance of Deep Neural Networks (DNNs) make them suitable for Internet of Things (IoT) applications. However, deploying DNNs on edge devices is prohibitive due to their colossal computation, energy, and storage requirements. This paper presents a novel approach, EarlyLight, for designing and training a lightweight DNN using a large-size DNN. The approach considers the available storage, processing speed, and maximum allowable processing time for executing the task on edge devices. We present a knowledge-distillation-based training procedure to train the lightweight DNN to adequate accuracy. During the training of the lightweight DNN, we introduce a novel early-halting technique, which preserves network resources and thus speeds up the training procedure. Finally, we present empirical and real-world evaluations to verify the effectiveness of the proposed approach under different constraints on various edge devices. en_US
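The abstract describes training a lightweight student network from a large teacher via knowledge distillation, with an early-halting rule to save training resources. A minimal sketch of these two ingredients is below; the temperature-softened distillation loss follows the standard Hinton-style formulation, while `early_halt` is a hypothetical patience-based stopping rule used only for illustration (the paper's actual halting criterion is not reproduced here).

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """Cross-entropy between the softened teacher and student distributions
    (the classic knowledge-distillation term)."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # T^2 scaling keeps gradient magnitudes comparable to the hard-label loss
    return -temperature ** 2 * sum(
        pt * math.log(ps) for pt, ps in zip(p_teacher, p_student)
    )

def early_halt(loss_history, patience=3, min_delta=1e-3):
    """Hypothetical early-halting check: stop once the distillation loss has
    not improved by at least min_delta for `patience` consecutive epochs."""
    if len(loss_history) <= patience:
        return False
    best_before = min(loss_history[:-patience])
    return all(l > best_before - min_delta for l in loss_history[-patience:])
```

In a training loop, `distillation_loss` would be computed per batch from the frozen teacher's logits, and `early_halt` would be checked at the end of each epoch to decide whether to stop distilling and free the device's resources early.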
dc.language.iso en en_US
dc.publisher Institute of Electrical and Electronics Engineers Inc. en_US
dc.relation.ispartofseries IEEE Transactions on Mobile Computing;
dc.subject Artificial neural networks en_US
dc.subject Deep neural networks en_US
dc.subject Internet of Things en_US
dc.subject knowledge distillation en_US
dc.subject Knowledge engineering en_US
dc.subject Mobile computing en_US
dc.subject Performance evaluation en_US
dc.subject Task analysis en_US
dc.subject Training en_US
dc.title Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation en_US
dc.type Article en_US

