Pattern Recognition and Artificial Intelligence (模式识别与人工智能)
2015, Issue 4, pp. 299-305 (7 pages)
Keywords: Speech Recognition; Language Model; Recurrent Neural Network; Word Vector
Abstract: The recurrent neural network language model (RNNLM) solves the data-sparseness and curse-of-dimensionality problems of traditional N-gram models. However, the original RNNLM still struggles to capture long-distance dependencies because of the vanishing-gradient problem. This paper proposes an improved RNNLM based on contextual word vectors. The model structure is modified by adding a feature layer to the input layer; during training, contextual word vectors are fed in through this feature layer to strengthen the network's ability to learn long-distance constraints. Experimental results show that the proposed method effectively improves the performance of the RNNLM.
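The idea described in the abstract can be sketched as follows. In a Mikolov-style RNNLM, the hidden state is computed from the current (one-hot) word and the previous hidden state; the paper's modification adds a feature input carrying a contextual word vector. The sketch below is a minimal NumPy illustration under assumptions of ours, not the authors' implementation: all parameter names (`U`, `W`, `F`, `Vout`, `G`, `E`), the toy sizes, and the choice of the mean of preceding word embeddings as the feature vector are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

V_SIZE, H, D = 10, 16, 8  # toy vocab size, hidden size, word-vector dim (assumed)

# Hypothetical parameters for an RNNLM augmented with a feature layer
U = rng.normal(0, 0.1, (H, V_SIZE))    # one-hot word -> hidden
W = rng.normal(0, 0.1, (H, H))         # hidden -> hidden (recurrence)
F = rng.normal(0, 0.1, (H, D))         # feature (context word vector) -> hidden
Vout = rng.normal(0, 0.1, (V_SIZE, H)) # hidden -> output
G = rng.normal(0, 0.1, (V_SIZE, D))    # feature -> output (direct connection)
E = rng.normal(0, 0.1, (D, V_SIZE))    # word-embedding table used for features

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def rnnlm_step(word_id, s_prev, context_ids):
    """One step of an RNNLM whose input is augmented with a feature
    vector f(t): here, the mean embedding of the preceding words."""
    w = np.zeros(V_SIZE)
    w[word_id] = 1.0                                    # one-hot current word
    f = E[:, context_ids].mean(axis=1) if context_ids else np.zeros(D)
    s = sigmoid(U @ w + W @ s_prev + F @ f)             # hidden state
    y = softmax(Vout @ s + G @ f)                       # next-word distribution
    return s, y

# Run over a toy word-id sequence, feeding the growing left context.
sequence = [1, 4, 2, 7]
s = np.zeros(H)
for t, wid in enumerate(sequence):
    s, y = rnnlm_step(wid, s, context_ids=sequence[:t])
```

Because the feature vector summarizes the whole left context rather than only the previous word, it injects long-distance information at every time step, which is the effect the paper attributes to its feature layer.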