Pruning neural networks for wiring length efficiency is considered. Three techniques are proposed and experimentally tested: distance-based regularization, nested-rank pruning, and layer-by-layer bipartite matching. The first two algorithms are used during the training and pruning phases, respectively, and the third during the neuron-arrangement phase. Experiments show that distance-based regularization with weight-based pruning tends to perform best, with or without layer-by-layer bipartite matching. These results suggest that these techniques may be useful in creating neural networks for implementation in widely deployed specialized circuits.
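
To make the first technique concrete, below is a minimal sketch of one plausible form of distance-based regularization in PyTorch: the penalty on each connection weight grows with the Euclidean distance between the fixed 2-D positions of the neurons it connects, so training drives long-range weights toward zero and makes them candidates for pruning. The function name `distance_regularizer`, the coefficient `lam`, and the random neuron placements are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def distance_regularizer(weight, in_pos, out_pos, lam=1e-4):
    """Penalize each connection weight in proportion to the physical
    distance between the neurons it connects (hypothetical form; the
    paper's exact regularizer may differ).

    weight:  (n_out, n_in) weight matrix of a linear layer
    in_pos:  (n_in, 2) fixed 2-D positions of input neurons
    out_pos: (n_out, 2) fixed 2-D positions of output neurons
    """
    # Pairwise Euclidean distances between output and input neurons.
    dist = torch.cdist(out_pos, in_pos)           # shape (n_out, n_in)
    # Long connections carrying large weights incur the largest penalty.
    return lam * (weight.abs() * dist).sum()

# Usage: add the penalty to the task loss during training.
layer = torch.nn.Linear(16, 8)
in_pos = torch.rand(16, 2)   # assumed fixed circuit placements
out_pos = torch.rand(8, 2)
x = torch.randn(4, 16)
task_loss = layer(x).pow(2).mean()  # stand-in for the real objective
loss = task_loss + distance_regularizer(layer.weight, in_pos, out_pos)
loss.backward()
```

After training with this penalty, a weight-based pruning pass that removes the smallest-magnitude weights would then preferentially cut the longest wires, which is the combination the experiments report as performing best.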