Title
應用多跳躍注意記憶關聯於記憶網路之研究
English Title
A Research of Applying Multi-hop Attention and Memory Relations on Memory Networks
Authors: 詹京翰、劉立頌、李俊宏
Chinese Abstract (translated)
Machine learning and deep learning have advanced rapidly in recent years, achieving major breakthroughs in natural language processing. Neural networks can carry out complex language tasks such as document classification, summarization, question answering, machine translation, and image caption generation. This paper takes memory networks as its research target, with question answering as the validation application. The model stores prior knowledge in memory, uses an attention mechanism to find the memories relevant to the question, and then reasons out the final answer. The question-answering experiments use the bAbI dataset provided by Facebook, which contains 20 different types of QA tasks for evaluating the model's accuracy across tasks. By computing relations between memories, this work reduces the number of memory associations, cutting the weight computation by 26.8% while also improving the model's accuracy, by up to about 9.2% in the experiments. The experiments also use a smaller amount of data as the validation target, showing that a considerable improvement can still be achieved even when the dataset is insufficient.
English Abstract
With the rapid advancement of machine learning and deep learning, great breakthroughs have been achieved in many areas of natural language processing in recent years. Complex language tasks, such as document classification, summarization, question answering, machine translation, and image caption generation, have been tackled with neural networks. In this paper, we propose a new model based on memory networks that uses a multi-hop mechanism to process a small set of sentences, with the question-answering task as the verification application. The model first stores knowledge in memory, then finds the relevant memories through the attention mechanism, and the output module reasons out the final answer. All experiments use the bAbI dataset provided by Facebook, which contains 20 different kinds of Q&A tasks for evaluating the model in different aspects. Our approach reduces the number of memory associations by computing relations between memories. Besides cutting the weight computation by 26.8%, it also improves the model's accuracy, by up to about 9.2% in our experiments. The experiments also use a smaller amount of data to show that the approach helps even when the dataset is insufficient.
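The multi-hop attention over memory described in the abstract can be sketched as follows. This is a minimal NumPy illustration in the style of end-to-end memory networks (query matched against memory embeddings by inner product, softmax attention, weighted memory read added back into the controller state, repeated for several hops); it is not the authors' exact architecture, and the embedding shapes, hop count, and update rule here are assumptions for illustration only.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_attention(memories, query, hops=3):
    """Illustrative multi-hop read: attend over memory embeddings,
    add the attended memory back into the query state, and repeat.

    memories: (n_mem, d) array of memory (sentence) embeddings
    query:    (d,) embedded question
    """
    u = query
    for _ in range(hops):
        scores = memories @ u          # (n_mem,) inner-product match
        p = softmax(scores)            # attention weights over memories
        o = p @ memories               # (d,) weighted sum of memories
        u = u + o                      # controller state for the next hop
    return u                           # fed to an output/answer module

# toy example: 4 memory slots, 8-dimensional embeddings
rng = np.random.default_rng(0)
mem = rng.normal(size=(4, 8))
q = rng.normal(size=8)
state = multi_hop_attention(mem, q)
print(state.shape)
```

In the full model each hop would typically use separate input/output embedding matrices and a final projection to answer vocabulary logits; the relation computation that prunes memory associations (the source of the reported 26.8% reduction) is omitted here.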
Pages: 103-122
Keywords: 記憶網路、多點跳躍網路、關係網路、注意力機制; Memory Networks; Multi-hop Networks; Relation Networks; Attention Mechanism
Journal: 中文計算語言學期刊 (International Journal of Computational Linguistics and Chinese Language Processing)
Issue: June 2020 (Vol. 25, No. 1)
Publisher: The Association for Computational Linguistics and Chinese Language Processing
Previous article in this issue: Linguistic Input and Child Vocalization of 7 Children from 5 to 30 Months: A Longitudinal Study with LENA Automatic Analysis