Talks and presentations

Modelling Connectivity: An Alternative Approach to Neural Network Compression

March 01, 2013

Talk, University of Maine (Virtual), Orono, ME, USA

Larger and deeper neural network models have become the modus operandi for tackling real-world problems. However, the focus on large-capacity DNNs runs counter to the requirements of their hardware implementations. Neural network compression via pruning has emerged as a popular approach for bridging the gap between exorbitantly large theoretical models and their slimmer hardware counterparts while maintaining a desired level of performance. Most approaches to neural network pruning impose deterministic constraints on the learned weight matrices, either by evaluating a filter’s importance with appropriate norms or by modifying the objective function with sparsity constraints. While these offer a useful way to approximate the contributions of individual filters, they either ignore the dependency between layers or solve a needlessly harder optimization problem. In this talk, I propose an alternative approach to neural network pruning that uses Conditional Mutual Information (CMI) under a probabilistic framework. I use CMI as a measure of connectivity between filters of adjacent layers across the entire DNN, which can then be used to prune filters that contribute less information to subsequent layers. Expanding on this, I show how ideas from the original weight-based approaches can be combined with the newly proposed probabilistic framework to yield a highly effective hybrid pruning solution.
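As a rough illustration of the ranking idea (a simplified sketch, not the method presented in the talk), filters in one layer can be scored by a histogram-based estimate of CMI with the next layer’s activations, conditioned on a coarse summary of the remaining filters. The names `acts_l` and `acts_next` (per-sample filter activations of shape `[n_samples, n_filters]`) and the mean-based summaries are illustrative assumptions.

```python
# Illustrative sketch: score layer-l filters by conditional mutual
# information (CMI) with the next layer, then prune the lowest-scoring
# ones. A simplified stand-in for the talk's method, not its actual code.
import numpy as np
from collections import Counter

def quantize(v, bins=8):
    """Map a 1-D continuous signal to equal-frequency integer bins."""
    edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
    return np.searchsorted(edges, v)

def cmi(x, y, z):
    """Histogram estimate of I(X; Y | Z) for discrete 1-D arrays:
    sum over (x, y, z) of p(x,y,z) * log(p(x,y,z) p(z) / (p(x,z) p(y,z)))."""
    n = len(x)
    n_xyz = Counter(zip(x, y, z))
    n_xz = Counter(zip(x, z))
    n_yz = Counter(zip(y, z))
    n_z = Counter(z)
    return sum(
        (c / n) * np.log(c * n_z[zi] / (n_xz[(xi, zi)] * n_yz[(yi, zi)]))
        for (xi, yi, zi), c in n_xyz.items()
    )

def score_filters(acts_l, acts_next, bins=8):
    """CMI between each layer-l filter and a summary of layer l+1,
    conditioned on a coarse summary of the remaining layer-l filters.
    acts_* are hypothetical [n_samples, n_filters] activation matrices."""
    y = quantize(acts_next.mean(axis=1), bins)
    scores = []
    for i in range(acts_l.shape[1]):
        x = quantize(acts_l[:, i], bins)
        z = quantize(np.delete(acts_l, i, axis=1).mean(axis=1), bins)
        scores.append(cmi(x, y, z))
    return np.array(scores)  # prune filters with the lowest scores

# Toy usage: 1000 samples, 16 filters feeding 32 filters.
rng = np.random.default_rng(0)
acts_l = rng.standard_normal((1000, 16))
acts_next = np.maximum(acts_l @ rng.standard_normal((16, 32)), 0.0)
print(np.argsort(score_filters(acts_l, acts_next))[:4])  # pruning candidates
```

In practice, one would condition on richer summaries than a plain mean and use a more robust CMI estimator; the sketch only conveys how low-CMI filters become pruning candidates.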