Fast-Convergent Fully Connected Deep Learning Model Using Constrained Nodes Input

Abstract
Deep learning models have recently exhibited promising performance in a wide range of applications, yet most of them converge slowly because of the vanishing-gradient problem. To address this issue, we propose a fast-convergent fully connected deep learning network. By constraining the input values of the nodes on the fully connected layers, the proposed method effectively mitigates vanishing gradients during training and thus greatly reduces the number of iterations required to reach convergence, while the loss in generalization performance is negligible. Experimental results validate the effectiveness of the proposed method.
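The abstract does not specify how the node inputs are constrained, so the sketch below is only a minimal illustration of the general idea, not the authors' method: clipping each node's pre-activation (its input before the nonlinearity) to a bounded range keeps a saturating activation such as the sigmoid out of its flat regions, so its local derivative stays bounded away from zero. The function name `constrained_forward` and the `limit` parameter are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def constrained_forward(x, W, b, limit=4.0):
    """One fully connected layer whose pre-activation (the node input)
    is clipped to [-limit, limit] before the sigmoid nonlinearity.
    This is an illustrative sketch, not the paper's exact constraint."""
    z = x @ W + b
    z_c = np.clip(z, -limit, limit)   # constrain the node input
    return sigmoid(z_c), z_c

# With |z| <= 4, the local derivative sigmoid'(z) = s(1 - s) is at least
# sigmoid(4) * (1 - sigmoid(4)) ~= 0.0177, whereas at |z| = 20 it is
# on the order of 2e-9 -- an effectively vanished gradient.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 32)) * 10.0   # deliberately large inputs
W = rng.normal(size=(32, 16))
b = np.zeros(16)

a, z_c = constrained_forward(x, W, b, limit=4.0)
grad_floor = sigmoid(4.0) * (1.0 - sigmoid(4.0))
local_grad = a * (1.0 - a)            # elementwise sigmoid derivative
```

Because the clipped pre-activation is bounded, every entry of `local_grad` stays at or above `grad_floor`, which is the mechanism by which a constraint of this kind can speed up convergence.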
| Original language | English |
|---|---|
| Pages (from-to) | 995-1005 |
| Number of pages | 11 |
| Journal | Neural Processing Letters |
| Volume | 49 |
| Issue number | 3 |
| DOIs | |
| State | Published - 15 Jun 2019 |
Keywords
- Constrained input value of nodes
- Deep learning model
- Fast convergent method