当代外语研究 ›› 2022, Vol. 22 ›› Issue (4): 98-110. doi: 10.3969/j.issn.1674-8921.2022.04.010

• Thought and Scholarship •

Neural Network Models in Natural Language Processing

FENG Zhiwei, DING Xiaomei

  1. Xinjiang University, Urumqi 830000
  2. Dalian Maritime University, Dalian 116026
  • Online: 2022-08-28  Published: 2022-09-13
  • About the authors: FENG Zhiwei is a professor and Tianshan Scholar at Xinjiang University. His main research interests are natural language processing, computational linguistics, and mathematical linguistics. E-mail: zwfengde2010@163.com | DING Xiaomei is an associate professor at the School of Foreign Languages, Dalian Maritime University. Her main research interests are Russian linguistics and corpus linguistics. E-mail: wyxydxm@dlmu.edu.com
  • Funding:
    * An interim result of the National Social Science Fund of China project "Research on the Compilation of a Russian-Chinese Dictionary of Linguistic Terms Based on a Parallel Corpus" (No. 17BYY220)

Abstract:

Natural Language Processing (NLP) is an interdisciplinary field that uses computers to study and process natural language. In recent years NLP has developed rapidly and has attracted great attention from the linguistic community. This paper discusses four types of neural network models in natural language processing: the feed-forward neural network (FNN) model, the convolutional neural network (CNN) model, the recurrent neural network (RNN) model, and the pre-training (PT) model, covering the basic principles, structures, algorithms, and mechanisms of these models and highlighting their applications in NLP. The paper points out that although neural network models have become the mainstream of NLP, they still lack interpretability and will need to be supported by rule-based language models and statistics-based language models in the future.

Key words: natural language processing, neural network model, feed-forward neural network model, convolutional neural network model, recurrent neural network model, pre-training model
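
As a concrete illustration of the first model type named in the abstract, the short Python sketch below builds a toy feed-forward neural network language model that predicts the next word from the current one through an embedding lookup, one tanh hidden layer, and a softmax over the vocabulary. The sketch is not taken from the paper; the toy vocabulary, the layer sizes, and all variable names are illustrative assumptions, and the weights are random rather than trained.

    # Minimal sketch (not from the paper) of a feed-forward neural network
    # language model: embedding lookup -> tanh hidden layer -> softmax over
    # the vocabulary. Vocabulary, sizes, and weights are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    vocab = ["<s>", "natural", "language", "processing", "</s>"]
    V, d, h = len(vocab), 8, 16        # vocabulary size, embedding dim, hidden dim

    E = rng.normal(0, 0.1, (V, d))     # word embedding matrix
    W1 = rng.normal(0, 0.1, (d, h))    # embedding -> hidden weights
    b1 = np.zeros(h)
    W2 = rng.normal(0, 0.1, (h, V))    # hidden -> vocabulary scores
    b2 = np.zeros(V)

    def softmax(z):
        z = z - z.max()                # numerical stability
        e = np.exp(z)
        return e / e.sum()

    def next_word_distribution(word_id):
        """P(next word | current word) under the feed-forward model."""
        x = E[word_id]                 # look up the embedding
        hidden = np.tanh(x @ W1 + b1)  # one non-linear hidden layer
        return softmax(hidden @ W2 + b2)

    p = next_word_distribution(vocab.index("language"))
    print({w: round(float(pi), 3) for w, pi in zip(vocab, p)})

Widening the input to a fixed window of previous words gives the classic feed-forward language model, while replacing the hidden layer with convolutional, recurrent, or pre-trained Transformer layers corresponds to the other three model families the abstract lists.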
