

Contributions to conference proceedings:

B. Haas, A. Wendt, A. Jantsch, M. Wess:
"Neural Network Compression Through Shunt Connections and Knowledge Distillation for Semantic Segmentation Problems";
in: "Artificial Intelligence Applications and Innovations, 17th IFIP WG 12.5 International Conference", herausgegeben von: Springer Nature; Springer Nature Switzerland AG, Greece, 2021, ISBN: 978-3-030-79149-0, S. 349 - 361.



English abstract:
Employing convolutional neural network models for large-scale datasets represents a big challenge. In particular, embedded devices with limited resources cannot run most state-of-the-art model architectures in real time, which is necessary for many applications. This paper proves the applicability of shunt connections on large-scale datasets and narrows this computational gap. Shunt connections are a proposed method for MobileNet compression. We are the first to provide results of shunt connections for the MobileNetV3 model and for segmentation tasks on the Cityscapes dataset, using the DeeplabV3 architecture, on which we achieve compression by 28% while observing a 3.52 drop in mIoU. The training of shunt-inserted models is optimized through knowledge distillation. The full code used for this work will be available online.
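As a rough illustration of the knowledge-distillation objective mentioned in the abstract (a generic textbook formulation, not necessarily the authors' exact loss), a minimal sketch in PyTorch might look as follows; the temperature and weighting values are illustrative assumptions:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Hard-label term: cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between the temperature-softened
    # teacher and student class distributions, scaled by T^2 as usual.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    # Weighted combination of the two terms.
    return alpha * ce + (1.0 - alpha) * kd

For a segmentation setting such as Cityscapes, student_logits and teacher_logits would be per-pixel class scores of shape [N, C, H, W] and labels the ground-truth label map of shape [N, H, W].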

Keywords:
Shunt connections, Knowledge distillation, Optimization, Latency, Accuracy, CIFAR, Cityscapes, DeepLab, MobileNet, Machine learning, Embedded machine learning


"Offizielle" elektronische Version der Publikation (entsprechend ihrem Digital Object Identifier - DOI)
http://dx.doi.org/10.1007/978-3-030-79150-6

Electronic version of the publication:
https://publik.tuwien.ac.at/files/publik_296405.pdf


Created from the publication database of TU Wien (Technische Universität Wien).