基于预训练机制的自修正复杂语义分析方法

Translated title of the contribution: Self-correcting complex semantic analysis method based on pre-training mechanism

Qing Li, Jiang Zhong, Lili Li, Qi Li

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

In knowledge services, content resources must be managed in a fragmented way that supports intelligent, knowledge-enabled, refined, and reorganized delivery. By deeply analyzing and mining the knowledge, technology, experience, and information hidden in semantics, this work addresses a bottleneck of traditional Text-to-SQL semantic parsing and proposes PT-Sem2SQL, a model based on a pre-training mechanism. An MT-DNN pre-training mechanism combined with Kullback-Leibler divergence is designed to deepen contextual semantic understanding, and a dedicated enhancement module captures the position of contextual semantic information within the sentence. A self-correcting method optimizes the execution process of the generation model to resolve erroneous outputs during decoding. Experimental results show that PT-Sem2SQL effectively improves the parsing of complex semantics, with accuracy surpassing that of related work.
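The abstract does not detail how the self-correcting step repairs erroneous decoder output. As one illustration only, the Python sketch below assumes a ranked list of candidate SQL decodings and validates each against the target schema, falling back to the next candidate when compilation fails; the function name self_correcting_decode, the toy papers table, and the use of SQLite's EXPLAIN are illustrative assumptions, not the paper's implementation.

import sqlite3
from typing import Iterable, Optional

def self_correcting_decode(candidates: Iterable[str],
                           connection: sqlite3.Connection) -> Optional[str]:
    """Return the first candidate SQL string that compiles against the
    connected schema; skip candidates that raise errors."""
    for sql in candidates:
        try:
            # EXPLAIN compiles the statement without materializing results,
            # so syntax and schema errors surface here.
            connection.execute("EXPLAIN " + sql)
            return sql
        except sqlite3.Error:
            continue  # fall back to the next-best candidate
    return None

# Toy schema and two candidate decodings ranked by model score.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE papers (id INTEGER, title TEXT, year INTEGER)")
ranked = [
    "SELECT titel FROM papers WHERE year = 2019",  # unknown column: rejected
    "SELECT title FROM papers WHERE year = 2019",  # valid: returned
]
print(self_correcting_decode(ranked, conn))        # prints the second candidate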

Original language: Chinese (Traditional)
Pages (from-to): 41-50
Number of pages: 10
Journal: Tongxin Xuebao/Journal on Communications
Volume: 40
Issue number: 12
DOIs
State: Published - 25 Dec 2019
Externally published: Yes
