TY - JOUR
T1 - FDE-Net
T2 - A memory-efficient densely connected network inspired from fractional-order differential equations for single image super-resolution
AU - Zhang, Xiao
AU - Zhang, Lei
AU - Wei, Wei
AU - Sun, Yuxuan
AU - Tian, Chunna
AU - Zhang, Yanning
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/10/1
Y1 - 2024/10/1
N2 - Dense connection has proven to be an effective way to take full advantage of the hierarchical features that deep neural networks (DNNs) extract from low-resolution images for single image super-resolution (SISR). However, existing densely connected networks are often manually designed and overly dependent on practical experience, leading to suboptimal performance. Moreover, due to their complicated connections, they are memory-consuming and lack interpretability. To address all these problems at once, we propose to construct a new densely connected network for SISR from a dynamic-system perspective. Following this idea, we cast the hierarchical transformation of DNNs as the state evolution of a fractional-order dynamic system, which enables us to construct two interdependent densely connected modules automatically from the system solution rather than by manual design. They are a prediction module that controls the system to iteratively predict the next state, and a correction module that iteratively refines the predicted state to improve prediction accuracy. With these two modules as the backbone, we establish a Fractional-order Differential Equations-based network (FDE-Net) for SISR. Since the iterative computation requires both densely connected modules to have a recurrent structure, FDE-Net is memory-efficient and well interpretable. In addition, we analyze the existence and uniqueness of the solution of the FDE to theoretically guarantee the feasibility of FDE-Net. Experiments on four SISR benchmark datasets demonstrate the superiority of FDE-Net over existing densely connected networks and other baselines in terms of generalization capacity, especially under limited memory.
AB - Dense connection has proven to be an effective way to take full advantage of the hierarchical features that deep neural networks (DNNs) extract from low-resolution images for single image super-resolution (SISR). However, existing densely connected networks are often manually designed and overly dependent on practical experience, leading to suboptimal performance. Moreover, due to their complicated connections, they are memory-consuming and lack interpretability. To address all these problems at once, we propose to construct a new densely connected network for SISR from a dynamic-system perspective. Following this idea, we cast the hierarchical transformation of DNNs as the state evolution of a fractional-order dynamic system, which enables us to construct two interdependent densely connected modules automatically from the system solution rather than by manual design. They are a prediction module that controls the system to iteratively predict the next state, and a correction module that iteratively refines the predicted state to improve prediction accuracy. With these two modules as the backbone, we establish a Fractional-order Differential Equations-based network (FDE-Net) for SISR. Since the iterative computation requires both densely connected modules to have a recurrent structure, FDE-Net is memory-efficient and well interpretable. In addition, we analyze the existence and uniqueness of the solution of the FDE to theoretically guarantee the feasibility of FDE-Net. Experiments on four SISR benchmark datasets demonstrate the superiority of FDE-Net over existing densely connected networks and other baselines in terms of generalization capacity, especially under limited memory.
KW - Dense connection
KW - DNNs
KW - Fractional-order dynamic system
UR - http://www.scopus.com/inward/record.url?scp=85197782999&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2024.128143
DO - 10.1016/j.neucom.2024.128143
M3 - Article
AN - SCOPUS:85197782999
SN - 0925-2312
VL - 600
JO - Neurocomputing
JF - Neurocomputing
M1 - 128143
ER -