SC2Net: Sparse LSTMs for Sparse Coding

Publication Type:
Conference Proceeding
Citation:
32nd AAAI Conference on Artificial Intelligence, AAAI 2018, 2018, pp. 4588 - 4595
Issue Date:
2018-01-01
Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers for computing sparse codes. However, ISTA suffers from the following problems: 1) ISTA employs a non-adaptive updating strategy that learns the parameters on each dimension with a fixed learning rate, which may lead to inferior performance due to the lack of diversity; 2) ISTA does not incorporate historical information into its updating rules, although such information has been proven helpful for speeding up convergence. To address these issues, we propose a novel formulation of ISTA (termed adaptive ISTA) by introducing an adaptive momentum vector. To solve the adaptive ISTA efficiently, we recast it as a recurrent neural network unit and show its connection with the well-known long short-term memory (LSTM) model. With the proposed unit, we present a neural network (termed SC2Net) that computes sparse codes in an end-to-end manner. To the best of our knowledge, this is one of the first works to bridge the ℓ1-solver and LSTM, and it may provide novel insights for understanding model-based optimization and LSTM. Extensive experiments show the effectiveness of our method on both unsupervised and supervised tasks.
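To make the baseline concrete, the following is a minimal sketch of classic ISTA with its fixed, non-adaptive step size, the behavior the abstract criticizes. Function names and the NumPy-based setup are illustrative, not from the paper; the adaptive momentum vector and the LSTM-style unit of SC2Net are not shown here, since the abstract does not give their formulas.

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=200):
    """Classic ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Every coordinate is updated with the same fixed step size 1/L,
    where L = ||A||_2^2 bounds the gradient's Lipschitz constant --
    the non-adaptive rule with no momentum/history term.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                 # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)  # gradient step + shrinkage
    return x

# Hypothetical usage: recover a sparse code from compressed measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

Adaptive ISTA, as described in the abstract, replaces this shared fixed learning rate with a per-dimension adaptive momentum vector, which is what allows the update to be unrolled into an LSTM-like recurrent unit.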