TY - JOUR
T1 - Context-aware style learning and content recovery networks for neural style transfer
AU - Wu, Lianwei
AU - Liu, Pusheng
AU - Yuan, Yuheng
AU - Liu, Siying
AU - Zhang, Yanning
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/5
Y1 - 2023/5
N2 - Neural text transfer aims to change the style of a text sequence while keeping its original content. Due to the lack of parallel data, unsupervised learning-based approaches have developed considerably. However, several problems remain in these approaches: (1) the generated transferred sequences sometimes exhibit inconsistencies between the transferred style and the content, and (2) it is difficult to ensure sufficient preservation of the core semantics of the original sequences in the transferred sequences. To address these issues, we propose Context-aware Style Learning and Content Recovery networks (CSLCR) for neural text transfer. Specifically, to improve the consistency between the transferred style and the content, the designed context-aware style learning layer (CSL) retrieves target-style samples that are semantically similar to the original sequence and promotes deep interactive fusion with the original sequence, so as to generate a transferred sequence with a context-aware style. To tackle the second problem, we explore a content constraint recovery layer (CCR) from an indirect perspective, which decodes and recovers the core content semantics of the original sequence and the transferred sequence through two recovery decoding layers, respectively, and intensifies the preservation of the core semantics of both sequences via a multi-level constraint mechanism. Experiments on two public datasets demonstrate the superiority of our proposed method.
AB - Neural text transfer aims to change the style of a text sequence while keeping its original content. Due to the lack of parallel data, unsupervised learning-based approaches have developed considerably. However, several problems remain in these approaches: (1) the generated transferred sequences sometimes exhibit inconsistencies between the transferred style and the content, and (2) it is difficult to ensure sufficient preservation of the core semantics of the original sequences in the transferred sequences. To address these issues, we propose Context-aware Style Learning and Content Recovery networks (CSLCR) for neural text transfer. Specifically, to improve the consistency between the transferred style and the content, the designed context-aware style learning layer (CSL) retrieves target-style samples that are semantically similar to the original sequence and promotes deep interactive fusion with the original sequence, so as to generate a transferred sequence with a context-aware style. To tackle the second problem, we explore a content constraint recovery layer (CCR) from an indirect perspective, which decodes and recovers the core content semantics of the original sequence and the transferred sequence through two recovery decoding layers, respectively, and intensifies the preservation of the core semantics of both sequences via a multi-level constraint mechanism. Experiments on two public datasets demonstrate the superiority of our proposed method.
KW - Consistency inference
KW - Context-aware style learning
KW - Core content recovery
KW - Natural language processing
KW - Neural text style transfer
UR - http://www.scopus.com/inward/record.url?scp=85147255630&partnerID=8YFLogxK
U2 - 10.1016/j.ipm.2023.103265
DO - 10.1016/j.ipm.2023.103265
M3 - Article
AN - SCOPUS:85147255630
SN - 0306-4573
VL - 60
JO - Information Processing and Management
JF - Information Processing and Management
IS - 3
M1 - 103265
ER -