Title |
---|
Initializing Deep Learning Based on Latent Dirichlet Allocation for Document Classification |
Abstract |
---|
The gradient-descent learning of deep neural networks is subject to local minima, and a good initialization may depend on the task. In contrast, for document classification tasks, latent Dirichlet allocation (LDA) was quite successful in extracting topic representations, but its performance was limited by its shallow architecture. In this study, LDA was adopted for efficient layer-by-layer pre-training of deep neural networks for a document classification task. Two-layer feedforward networks were added at the end of the process and trained with a supervised learning algorithm. Across 10 different random initializations, the LDA-based initialization yielded a much lower mean and standard deviation of the false recognition rate than other state-of-the-art initialization methods. This suggests that the multi-layer expansion of the probabilistic generative LDA model can extract efficient hierarchical topic representations for document classification. |
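The abstract describes the pipeline only at a high level, so the following is a minimal Python sketch of the idea, assuming stacked scikit-learn LDA models whose topic-word matrices seed the weights of the corresponding dense layers of a PyTorch network, followed by a small supervised feedforward head. The synthetic corpus, vocabulary size, topic counts, rescaling factor, and training hyperparameters are illustrative assumptions, not values from the paper.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import LatentDirichletAllocation

# Illustrative sizes only; the paper does not report these hyperparameters here.
n_words, n_topics1, n_topics2, n_classes = 2000, 128, 64, 20

# Synthetic bag-of-words counts X and labels y keep the sketch self-contained;
# a real experiment would use an actual document-term matrix.
rng = np.random.default_rng(0)
X = rng.poisson(0.05, size=(500, n_words)).astype(np.float64)
y = rng.integers(0, n_classes, size=500)

# Layer-by-layer LDA "pre-training": the first LDA maps word counts to topic
# mixtures; the second LDA is fit on those mixtures, rescaled to pseudo-counts.
lda1 = LatentDirichletAllocation(n_components=n_topics1, random_state=0)
H1 = lda1.fit_transform(X)
lda2 = LatentDirichletAllocation(n_components=n_topics2, random_state=0)
lda2.fit(H1 * 100.0)

class LdaInitializedNet(nn.Module):
    """Deep net whose hidden layers are initialized from LDA topic-word matrices."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(n_words, n_topics1)
        self.fc2 = nn.Linear(n_topics1, n_topics2)
        # Two-layer feedforward classifier added on top, trained with supervision.
        self.head = nn.Sequential(nn.Linear(n_topics2, 64), nn.ReLU(),
                                  nn.Linear(64, n_classes))
        with torch.no_grad():
            # Normalize each topic's word distribution and copy it in as weights.
            w1 = lda1.components_ / lda1.components_.sum(axis=1, keepdims=True)
            w2 = lda2.components_ / lda2.components_.sum(axis=1, keepdims=True)
            self.fc1.weight.copy_(torch.tensor(w1, dtype=torch.float32))
            self.fc2.weight.copy_(torch.tensor(w2, dtype=torch.float32))

    def forward(self, x):
        return self.head(torch.relu(self.fc2(torch.relu(self.fc1(x)))))

model = LdaInitializedNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
xb = torch.tensor(X, dtype=torch.float32)
yb = torch.tensor(y, dtype=torch.long)
for _ in range(20):  # short supervised fine-tuning of the whole stack
    optimizer.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    optimizer.step()
```

Rescaling the first layer's topic mixtures into pseudo-counts before fitting the second LDA is one plausible way to stack the layers; the paper's exact multi-layer expansion of LDA may differ.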
Year | DOI | Venue |
---|---|---|
2016 | 10.1007/978-3-319-46675-0_70 | Lecture Notes in Computer Science |
Keywords | Field | DocType
---|---|---|
Document classification, Deep learning, Latent Dirichlet allocation, Good initialization | Document classification, Latent Dirichlet allocation, Pattern recognition, Computer science, Maxima and minima, Artificial intelligence, Initialization, Probabilistic logic, Deep learning, Standard deviation, Machine learning, Feedforward | Conference
Volume | ISSN | Citations
---|---|---|
9949 | 0302-9743 | 0
PageRank | References | Authors
---|---|---|
0.34 | 5 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Hyung-Bae Jeon | 1 | 2 | 3.10 |
Soo-Young Lee | 2 | 1137 | 163.87 |