Factual Question Generation Model Construction With Limited Paired Training Data

dc.contributor柯佳伶zh_TW
dc.contributorKoh, Jia-Lingen_US
dc.contributor.author蕭雅方zh_TW
dc.contributor.authorHsiao, Ya-Fangen_US
dc.date.accessioned2020-10-19T06:59:29Z
dc.date.available2021-06-30
dc.date.available2020-10-19T06:59:29Z
dc.date.issued2020
dc.description.abstractThis thesis considers the situation in which the paired data of reading sentences and their corresponding questions are limited, and applies the concept of transfer learning, using unpaired data to strengthen the learning of an encoder-decoder model so that the model can still generate questions of a quality comparable to a model trained on a large amount of paired data. The study adopts a sequence-to-sequence model: first, auto-encoder architectures are trained in an unsupervised manner on a large number of sentences and questions that require no pair labeling. Next, the pre-trained encoder that understands sentences and the pre-trained decoder that generates questions are extracted and combined, a transfer layer is added to the encoder to construct a new model, and the model parameters are then fine-tuned with selected sentence-question pairs through transfer learning. The experimental results show that, with the transfer learning approach and training strategies designed in this thesis, training with only half of the sentence-question pairs still yields a better question generation model than training directly on all of the paired training data.zh_TW
dc.description.abstractIn real applications, a large number of sentence-question pairs is usually not available for training a question generation model. To address this problem, we adopt network-based transfer learning, using unpaired training data to enhance the learning effect of an encoder-decoder model, so that the resulting model achieves a generation quality similar to that of a model trained directly on a large amount of paired data. In this study, we first use a large number of sentences and questions, which do not need to be labeled as pairs, to train two auto-encoders, respectively. We then combine the pre-trained encoder, which encodes the semantics of a sentence, with the pre-trained decoder, which generates a question, insert a transfer layer after the encoder, and fine-tune the model parameters with a small number of paired data. The experimental results show that, by applying the designed training strategies, the question generation model trained on less than half of the paired training data still achieves better performance than the model trained directly on all of the training data.en_US
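The abstract describes a concrete training scheme: pre-train two auto-encoders on unpaired sentences and questions, combine the pre-trained sentence encoder with the pre-trained question decoder through a transfer layer, and fine-tune the combined model on a small set of sentence-question pairs. Below is a minimal sketch of that scheme, assuming a PyTorch GRU-based sequence-to-sequence setup; all module names, dimensions, and hyperparameters are illustrative assumptions, not the thesis implementation.

import torch
import torch.nn as nn

VOCAB, EMB, HID = 8000, 128, 256   # assumed vocabulary and layer sizes

class Encoder(nn.Module):
    """Sentence (or question) encoder: token ids -> final hidden state."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
    def forward(self, x):                       # x: (B, T) token ids
        _, h = self.rnn(self.emb(x))            # h: (1, B, HID)
        return h

class Decoder(nn.Module):
    """Decoder with teacher forcing: hidden state + shifted targets -> logits."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)
    def forward(self, y_in, h):
        o, _ = self.rnn(self.emb(y_in), h)
        return self.out(o)                      # (B, T, VOCAB) logits

def autoencoder_step(enc, dec, opt, batch, loss_fn):
    """Unsupervised pre-training step: reconstruct the input sequence."""
    opt.zero_grad()
    logits = dec(batch[:, :-1], enc(batch))
    loss = loss_fn(logits.reshape(-1, VOCAB), batch[:, 1:].reshape(-1))
    loss.backward()
    opt.step()
    return loss.item()

# 1) Pre-train two auto-encoders on unpaired sentences and questions.
sent_enc, sent_dec = Encoder(), Decoder()
ques_enc, ques_dec = Encoder(), Decoder()
pre_opt = torch.optim.Adam(list(sent_enc.parameters()) + list(sent_dec.parameters()))
dummy_sentences = torch.randint(0, VOCAB, (4, 12))      # stand-in for the unpaired corpus
autoencoder_step(sent_enc, sent_dec, pre_opt, dummy_sentences, nn.CrossEntropyLoss())
# ... repeat over the unpaired sentence corpus, and likewise for (ques_enc, ques_dec) ...

# 2) Combine the pre-trained sentence encoder and question decoder,
#    with a transfer layer inserted after the encoder.
class QGModel(nn.Module):
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder, self.decoder = encoder, decoder
        self.transfer = nn.Linear(HID, HID)     # transfer layer bridging the two halves
    def forward(self, sent, ques_in):
        h = torch.tanh(self.transfer(self.encoder(sent)))
        return self.decoder(ques_in, h)

# 3) Fine-tune the combined model on the limited paired data.
model = QGModel(sent_enc, ques_dec)
ft_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
sent = torch.randint(0, VOCAB, (4, 12))         # dummy sentence-question pair batch
ques = torch.randint(0, VOCAB, (4, 10))
ft_opt.zero_grad()
logits = model(sent, ques[:, :-1])
loss = loss_fn(logits.reshape(-1, VOCAB), ques[:, 1:].reshape(-1))
loss.backward()
ft_opt.step()

The transfer layer here is a single linear projection with a tanh activation on the encoder's final hidden state, which is one plausible reading of "adding a transfer layer to the encoder"; the thesis may use a different placement or form for this component.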
dc.description.sponsorshipDepartment of Computer Science and Information Engineeringzh_TW
dc.identifierG060747044S
dc.identifier.urihttp://etds.lib.ntnu.edu.tw/cgi-bin/gs32/gsweb.cgi?o=dstdcdr&s=id=%22G060747044S%22.&%22.id.&
dc.identifier.urihttp://rportal.lib.ntnu.edu.tw:80/handle/20.500.12235/111738
dc.languageChinese
dc.subject問題生成zh_TW
dc.subject深度學習zh_TW
dc.subject自然語言處理zh_TW
dc.subject語言模型zh_TW
dc.subject遷移學習zh_TW
dc.subjectQuestion Generationen_US
dc.subjectDeep Learningen_US
dc.subjectNatural Language Processingen_US
dc.subjectLanguage Modelen_US
dc.subjectTransfer Learningen_US
dc.title以有限配對資料訓練事實問題生成模型之研究zh_TW
dc.titleFactual Question Generation Model Construction With Limited Paired Training Dataen_US
