
Detailed Record

Author: 張宇宏
Author (English): Yu-Hong Zhang
Title: 以BERT結合情緒字典應用於情緒分析之研究
Title (English): Combining BERT with Sentiment Lexicon for Sentiment Analysis
Advisor: 江政欽
Advisor (English): Cheng-Chin Chiang
Committee Members: 黃雅軒, 林信鋒
Committee Members (English): Ya-Xuan Huang, Shin-Feng Lin
Degree: Master
Institution: National Dong Hwa University
Department: Department of Computer Science and Information Engineering
Student ID: 610721224
Publication Year (ROC): 109 (2020)
Graduation Academic Year (ROC): 108
Language: Chinese
Pages: 31
Keywords: 注意力機制, 情緒辭典, 情緒分析, 類神經網路
Keywords (English): Attention Mechanism, Sentiment Lexicon, Sentiment Analysis, Neural Network
Statistics:
  • Recommendations: 0
  • Views: 40
  • Downloads: 28
  • Bookmarks: 0
Abstract (Chinese, translated): Textual sentiment analysis is an interesting task that aims to let a machine identify, from a sentence, the writer's current emotion and affective reaction. Technically, the problem can be treated as one of classifying words and sentences into different emotion categories. However, a sentence often mixes many different parts of speech and may even contain a good deal of redundant text unrelated to emotion, so accurately identifying the emotion expressed in a sentence is not easy, and many researchers have tried to solve this problem with machine-learning techniques. Previous work has shown that a Long Short-Term Memory (LSTM) neural network that combines an attention mechanism with a sentiment lexicon can perform quite well. However, LSTM is no longer the mainstream neural network model for processing textual data, and that method can only apply the sentiment lexicon directly, without adapting it dynamically to the text corpus. This thesis therefore proposes a BERT model to improve on the LSTM attention mechanism, and designs a better adaptation scheme for the sentiment lexicon that re-encodes it according to the sentences, achieving dynamic adaptation. Experimental results show that the proposed method surpasses the previous best models in accuracy.
Abstract (English): Text sentiment analysis is an interesting task that lets a machine recognize the writer's current sentiment and emotional reaction from sentences. Technically, the problem can be regarded as classifying sentences into different emotion types. However, a sentence often mixes many different parts of speech and may even contain redundant words that have nothing to do with emotion, so it is not easy to recognize its sentiment accurately. Many researchers have attempted to solve this problem with machine-learning techniques. Previous studies show that a Long Short-Term Memory (LSTM) network combined with an attention mechanism and a sentiment lexicon can be quite effective. Nonetheless, LSTM is no longer the dominant neural network model for processing textual data, and that approach can only apply the sentiment lexicon as-is, without adapting it to the textual corpus. To harness the power of both the sentiment lexicon and a wide-domain textual corpus, this thesis proposes a BERT model to improve on the LSTM attention mechanism, together with an adaptive mechanism that re-encodes the lexicon entries to better fit the corpus. Experimental results show that our method outperforms the previous state-of-the-art models.
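To make the abstract's two key ideas concrete — re-encoding the sentiment lexicon as a learnable embedding and learning the blending factor from the data instead of fixing it by hand — the short PyTorch sketch below shows one way such a model could be wired together. It is a minimal illustration under assumed names (SentimentGuidedBert, blend_gate, and lexicon_ids are hypothetical), not the thesis's exact architecture:

import torch.nn as nn
from transformers import BertModel

class SentimentGuidedBert(nn.Module):
    """Hypothetical sketch: BERT token states blended with a learnable
    sentiment-lexicon embedding via a learned per-token factor."""

    def __init__(self, num_classes: int, num_lexicon_tags: int, hidden: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Lexicon tags (e.g. 0=neutral, 1=positive, 2=negative) receive a
        # trainable embedding, so lexicon entries are re-encoded to fit the
        # corpus instead of being applied as fixed scores.
        self.lexicon_emb = nn.Embedding(num_lexicon_tags, hidden)
        # A per-token blending factor in (0, 1), computed from the BERT
        # state, replaces a hand-tuned global mixing weight.
        self.blend_gate = nn.Sequential(nn.Linear(hidden, 1), nn.Sigmoid())
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, lexicon_ids):
        # lexicon_ids holds one lexicon tag per input token.
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        alpha = self.blend_gate(states)                      # (B, T, 1)
        mixed = states + alpha * self.lexicon_emb(lexicon_ids)
        # Mean-pool over the non-padding tokens, then classify.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (mixed * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        return self.classifier(pooled)

Trained end-to-end on a labeled corpus such as the Stanford Sentiment Treebank, both the lexicon embedding and the gate are updated by backpropagation, which is one plausible reading of the "dynamic adaptation" the abstract describes.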
Table of Contents
Acknowledgments I
Abstract (Chinese) II
Abstract (English) III
List of Figures V
List of Tables VI
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation 2
Chapter 2 Literature Review 3
2.1 Sentiment Analysis 3
2.2 Attention-Based Text Analysis Models 5
2.2.1 Attention Mechanism 5
2.2.2 Transformer 7
2.3 BERT 8
2.4 Neural Networks with Sentiment Lexicons 9
2.5 Design Rationale of the Proposed Method 10
Chapter 3 Sentiment-Guided BERT 11
3.1 Overall Architecture 11
3.1.1 Hard Boost 14
3.1.2 Soft Boost 15
3.1.3 Merge Boost 16
3.2 Proposed Method 18
3.2.1 Learnable Sentiment Lexicon Embedding 19
3.2.2 Automatic Blending Factor Adaptation 19
Chapter 4 Experimental Setup 23
4.1 Sentiment Lexicon and Dataset 23
4.1.1 Sentiment Lexicon 23
4.1.2 Dataset: Stanford Sentiment Treebank 24
4.2 Neural Network Architecture 25
4.3 Training Procedure 26
Chapter 5 Experimental Results and Analysis 27
5.1 Compared Models 27
5.2 Experimental Results 28
5.3 Model Analysis 29
Chapter 6 Conclusions and Future Directions 33
References 35