Local Learning Rules for Nonnegative Tucker Decomposition
- Anh Huy Phan,
- Andrzej Cichocki
Analysis of high-dimensional data in modern applications, such as spectral analysis, neuroscience, and chemometrics, naturally requires tensorial approaches that differ from standard matrix factorizations (PCA, ICA, NMF). The Tucker decomposition and its constrained versions with sparsity and/or nonnegativity constraints allow the extraction of a different number of hidden factors in each mode and permit interactions within each modality, with many potential applications in computational neuroscience, text mining, and data analysis. In this paper, we propose a new algorithm for Nonnegative Tucker Decomposition (NTD) based on constrained minimization of a set of local cost functions, which is suitable for large-scale problems. Extensive experiments confirm the validity and high performance of the developed algorithms in comparison with other well-known algorithms.
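The paper's local learning rules are not reproduced in this record, but the NTD model it addresses, Y ≈ G ×₁ A ×₂ B ×₃ C with a nonnegative core G and nonnegative factor matrices, can be illustrated with a standard multiplicative-update scheme (Lee–Seung-style rules applied to each mode unfolding). The sketch below is a minimal NumPy illustration of that generic approach, not the authors' algorithm; all function names are hypothetical.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization; columns enumerate the remaining indices in C order."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def reconstruct(G, A):
    """Rebuild the full tensor from core G and factor list A."""
    X = G
    for n in range(3):
        X = mode_dot(X, A[n], n)
    return X

def ntd_multiplicative(Y, ranks, n_iter=300, eps=1e-9, seed=0):
    """Generic multiplicative-update NTD sketch for a 3-way nonnegative tensor Y.

    Fits Y ≈ G x1 A[0] x2 A[1] x3 A[2] under nonnegativity; updates keep every
    entry nonnegative because numerators and denominators are nonnegative.
    """
    rng = np.random.default_rng(seed)
    A = [rng.random((Y.shape[n], ranks[n])) for n in range(3)]
    G = rng.random(ranks)
    for _ in range(n_iter):
        for n in range(3):
            # With C-order unfolding, Y_(n) ≈ A[n] @ Z where
            # Z = G_(n) @ kron(other factors in natural mode order).T
            others = [A[m] for m in range(3) if m != n]
            Z = unfold(G, n) @ np.kron(others[0], others[1]).T
            A[n] *= (unfold(Y, n) @ Z.T) / (A[n] @ (Z @ Z.T) + eps)
        # Core update: G <- G * (Y x_n A[n]^T) / (G x_n A[n]^T A[n])
        num, den = Y, G
        for n in range(3):
            num = mode_dot(num, A[n].T, n)
            den = mode_dot(den, A[n].T @ A[n], n)
        G = G * num / (den + eps)
    return G, A
```

For example, `G, A = ntd_multiplicative(Y, (2, 2, 2))` fits a rank-(2,2,2) nonnegative Tucker model; `reconstruct(G, A)` returns the approximation for inspecting the fit.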
- Record ID
- Chi Sing Leung, Minho Lee, Jonathan H. Chan (eds.): Neural Information Processing, Lecture Notes in Computer Science, vol. 5863, 2009, Springer Berlin Heidelberg, ISBN 978-3-642-10676-7, 978-3-642-10677-4
- Keywords in English
- Artificial Intelligence (incl. Robotics), Computation by Abstract Devices, Image Processing and Computer Vision, Information Systems Applications (incl. Internet), Pattern Recognition, Simulation and Modeling
- http://link.springer.com/chapter/10.1007/978-3-642-10677-4_61
- Score (nominal)
- Publication indicators
- Citation count = 12
- Uniform Resource Identifier
* The presented citation count is obtained through Internet information analysis and is close to the number calculated by the Publish or Perish system.