月旦知識庫
 
Title
上下文語言模型化技術於常見問答檢索之研究
English Title
A Study on Contextualized Language Modeling for FAQ Retrieval
Authors 曾琬婷, 許永昌, 陳柏琳
Chinese Abstract (translated)
In recent years, deep learning techniques have made breakthrough progress and achieved impressive performance in many natural language processing applications, such as the FAQ (Frequently Asked Question) retrieval task. FAQ retrieval is widely applied in many domains, including e-commerce services and online forums; its goal is to provide the most appropriate answer in response to a user's query (question). To date, several FAQ retrieval strategies have been proposed, such as comparing the similarity between the user query and a standard question, measuring the relevance between the user query and the answer associated with a standard question, or classifying the user query. Accordingly, many novel contextualized deep neural language models have been employed to realize these strategies, for example BERT (Bidirectional Encoder Representations from Transformers) and its extensions such as K-BERT and Sentence-BERT. Although BERT and its extensions have achieved good results on FAQ retrieval tasks, there is still room for improvement on FAQ tasks that require general domain knowledge. Therefore, this paper explores how to use additional information, such as knowledge graphs, to strengthen BERT's performance on FAQ retrieval, and compares how combinations of different strategies and methods perform on the FAQ retrieval task.
English Abstract
Recent years have witnessed significant progress in the development of deep learning techniques, which have achieved state-of-the-art performance in a wide variety of natural language processing (NLP) applications, such as the frequently asked question (FAQ) retrieval task. FAQ retrieval, which aims to provide relevant information in response to frequent questions or concerns, has far-reaching applications in e-commerce services, online forums, and many other domains. In the common setting of the FAQ retrieval task, a collection of question-answer (Q-A) pairs compiled in advance can be capitalized on to retrieve an appropriate answer in response to a user's query that is likely to recur frequently. To date, many strategies have been proposed to approach FAQ retrieval, ranging from comparing the similarity between the query and a question, to scoring the relevance between the query and the associated answer of a question, to performing classification on user queries. As such, a variety of contextualized language models have been extended and developed to operationalize the aforementioned strategies, such as BERT (Bidirectional Encoder Representations from Transformers), K-BERT and Sentence-BERT. Although BERT and its variants have demonstrated reasonably good results on various FAQ retrieval tasks, they may still fall short on tasks that resort to generic knowledge. In view of this, in this paper we set out to explore the utility of injecting an extra knowledge base into BERT for FAQ retrieval, meanwhile comparing the synergistic effects of different strategies and methods.
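The first retrieval strategy named in the abstract (ranking stored questions by their similarity to the user query and returning the top question's answer) can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the toy bag-of-words encoder below stands in for a contextualized encoder such as Sentence-BERT, which would produce dense embeddings instead of sparse word counts; the FAQ data is hypothetical.

```python
from collections import Counter
from math import sqrt

def encode(text):
    # Toy bag-of-words encoder; in the paper's setting this would be a
    # contextualized model (e.g. Sentence-BERT) producing dense vectors.
    return Counter(text.lower().replace("?", "").split())

def cosine(u, v):
    # Cosine similarity between two sparse count vectors.
    dot = sum(u[w] * v[w] for w in u)
    nu = sqrt(sum(c * c for c in u.values()))
    nv = sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve(query, faq):
    # Strategy 1 from the abstract: rank the stored standard questions by
    # similarity to the user query and return the best question's answer.
    q = encode(query)
    best_question, best_answer = max(faq, key=lambda qa: cosine(q, encode(qa[0])))
    return best_answer

# Hypothetical Q-A pairs compiled in advance.
faq = [
    ("How do I reset my password?", "Use the 'Forgot password' link."),
    ("How can I contact support?", "Email support@example.com."),
]
print(retrieve("reset password", faq))  # → "Use the 'Forgot password' link."
```

The other two strategies from the abstract would replace the scoring step: scoring the query against each stored answer rather than each stored question, or training a classifier that maps the query directly to one of the Q-A pairs.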
Pages 1-13
Keywords Frequently Asked Question, Knowledge Graph, Natural Language Processing, Information Retrieval, Deep Learning (常見問答集檢索, 知識圖譜, 自然語言處理, 資訊檢索, 深度學習)
Journal ROCLING論文集 (Proceedings of ROCLING)
Issue 2020
Publisher 中華民國計算語言學學會 (The Association for Computational Linguistics and Chinese Language Processing)
Previous article in this issue: Nepali Speech Recognition Using CNN, GRU and CTC
Next article in this issue: French and Russian students' production of Mandarin tones
 
