Pruning neural networks for wiring-length efficiency is considered. Three techniques are proposed and experimentally tested: distance-based regularization, nested-rank pruning, and layer-by-layer bipartite matching. The first two algorithms are applied in the training and pruning phases, respectively, and the third in the phase where neurons are arranged. Experiments show that distance-based regularization combined with weight-based pruning tends to perform best, with or without layer-by-layer bipartite matching. These results suggest that these techniques may be useful for creating neural networks intended for implementation in widely deployed specialized circuits.
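To make the idea of distance-based regularization concrete, below is a minimal sketch rather than the paper's actual formulation: it assumes a simple 1-D layout of neurons, a hypothetical `distance_matrix` helper, and an arbitrary penalty coefficient `lam`. The penalty scales each connection's weight magnitude by the distance it would span, so long wires are driven toward zero and become candidates for pruning.

```python
# Sketch of distance-based regularization under an assumed 1-D neuron layout.
import torch
import torch.nn as nn

def distance_matrix(n_out: int, n_in: int) -> torch.Tensor:
    # Place each layer's neurons at integer positions on a line and use the
    # absolute positional difference as a proxy for wire length (assumption).
    out_pos = torch.arange(n_out, dtype=torch.float32).unsqueeze(1)
    in_pos = torch.arange(n_in, dtype=torch.float32).unsqueeze(0)
    return (out_pos - in_pos).abs()

def wiring_penalty(layer: nn.Linear, dist: torch.Tensor, lam: float = 1e-4) -> torch.Tensor:
    # Penalize each connection by |weight| * distance, so long connections
    # are pushed toward zero and can later be pruned.
    return lam * (layer.weight.abs() * dist).sum()

# Usage: add the penalty to the task loss during training.
layer = nn.Linear(64, 32)
dist = distance_matrix(32, 64)
x = torch.randn(8, 64)
loss = layer(x).pow(2).mean() + wiring_penalty(layer, dist)  # dummy task loss
loss.backward()
```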