Memory Network with Multi-Head Attention for Chatbot
Author: Ren Jianlong, Yang Li, Kong Weiyi, Zuo Chun

Abstract:

Modeling and reasoning over multi-turn dialogue history is a central challenge in building an intelligent chatbot. Memory networks with recurrent or gated architectures have shown promise for conversation modeling, but they still suffer from two drawbacks: relatively low computational efficiency, owing to their complex architectures, and a reliance on costly strong supervision or fixed prior knowledge, which hinders their extension to new domains. This paper proposes an end-to-end memory network with multi-head attention. First, the model represents text input with word embeddings combined with position encoding. Second, it uses multi-head attention to capture important information in different subspaces of the conversational interactions. Finally, multiple attention layers are stacked via shortcut connections so the model can reason repeatedly over the modeled dialogue. Experiments on the bAbI-dialog datasets show that the network effectively models and reasons over multi-turn dialogue and achieves better time performance.
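
The following is a minimal PyTorch sketch of the three components the abstract describes: word embeddings combined with position encoding, multi-head attention over the dialogue memory, and stacked attention hops joined by shortcut connections. It is an illustrative reading, not the paper's implementation; the hyperparameters (d_model, n_heads, n_hops), the sinusoidal encoding, and the pooled response-scoring head are all assumptions.

import math
import torch
import torch.nn as nn

def position_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Sinusoidal position encoding added to the word embeddings (assumed form)."""
    pos = torch.arange(seq_len).unsqueeze(1).float()
    div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe

class MemoryHop(nn.Module):
    """One reasoning hop: multi-head attention over the dialogue memory,
    merged with the query through a shortcut (residual) connection."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, query, memory):
        out, _ = self.attn(query, memory, memory)  # attend to memory in n_heads subspaces
        return self.norm(query + out)              # shortcut connection

class MultiHeadMemoryNetwork(nn.Module):
    """End-to-end memory network: embedding plus position encoding,
    then n_hops stacked attention layers for repeated reasoning."""
    def __init__(self, vocab_size: int, d_model: int = 64, n_heads: int = 4, n_hops: int = 3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.hops = nn.ModuleList([MemoryHop(d_model, n_heads) for _ in range(n_hops)])
        self.out = nn.Linear(d_model, vocab_size)  # hypothetical response-scoring head

    def forward(self, history, query):
        # history: (batch, mem_len) token ids; query: (batch, q_len) token ids
        mem = self.embed(history) + position_encoding(history.size(1), self.embed.embedding_dim)
        q = self.embed(query) + position_encoding(query.size(1), self.embed.embedding_dim)
        for hop in self.hops:           # stacked hops = repeated reasoning
            q = hop(q, mem)
        return self.out(q.mean(dim=1))  # pool the query tokens, score the vocabulary

# Usage with dummy token ids: two dialogues, 30-token history, 8-token query.
model = MultiHeadMemoryNetwork(vocab_size=1000)
history = torch.randint(0, 1000, (2, 30))
query = torch.randint(0, 1000, (2, 8))
logits = model(history, query)  # shape (2, 1000)

The shortcut connection lets each hop refine the query representation instead of replacing it, which is how the stacked layers implement repeated reasoning over the same memory.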

Get Citation

Ren Jianlong, Yang Li, Kong Weiyi, Zuo Chun. Memory network with multi-head attention for chatbot. Computer Systems & Applications, 2019, 28(9): 18-24. (in Chinese)

History
  • Received: February 27, 2019
  • Revised: March 22, 2019
  • Online: September 09, 2019
  • Published: September 15, 2019