STEMM Institute Press
Science, Technology, Engineering, Management and Medicine
Spatial-Temporal Aware Disaster Topic Recognition in Social Media Text Using SageMaker
DOI: https://doi.org/10.62517/jbdc.202301313
Author(s)
Zheng He1, Zhifei Luo2, Lin Li2
Affiliation(s)
1 Deloitte Consulting Co., Ltd., Shanghai, China; 2 Wuhan University of Technology, Wuhan, Hubei, China
Abstract
With the popularization of social media, users share and disseminate large amounts of information on these platforms, including disaster-related posts. The disaster information contained in such posts is of great value for emergency response, disaster research, and the understanding of public opinion. In the task of classifying social media posts by disaster topic, previous approaches generally analyze only the semantics of the text itself to determine its relevance to a disaster. This paper takes a new perspective that combines the spatio-temporal attributes of social media posts with their textual content to perform the classification task. Based on this idea, the paper constructs a heterogeneous graph over the dataset using the temporal and spatial attributes of blog posts, initializes the post nodes with BERT features, and learns post representations through relational graph convolution. Experiments on the SageMaker platform use BERT as the baseline model, and the same model provides the initial post features. The experimental results show that the spatio-temporal model outperforms the baseline.
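The sketch below illustrates the pipeline the abstract describes; it is not the authors' released code. It assumes PyTorch Geometric's RGCNConv and Hugging Face's bert-base-uncased, and the toy posts, time buckets, and locations are illustrative placeholders. In practice such a script would be packaged as a SageMaker training job rather than run standalone.

# Minimal sketch (not the authors' code): BERT node features + a heterogeneous
# post graph (shared time bucket / shared location relations) + R-GCN classifier.
import itertools
import torch
from transformers import BertTokenizer, BertModel
from torch_geometric.nn import RGCNConv

# Toy (text, time bucket, location) posts; real data would come from the dataset.
posts = [
    ("Flood water is rising near the river bank", "2021-07-20", "Zhengzhou"),
    ("Roads are closed downtown after heavy rain", "2021-07-20", "Zhengzhou"),
    ("Great coffee at the new cafe this morning", "2021-07-21", "Shanghai"),
]

# 1. Initialize post nodes with BERT [CLS] embeddings.
tok = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()
with torch.no_grad():
    enc = tok([p[0] for p in posts], padding=True, truncation=True, return_tensors="pt")
    x = bert(**enc).last_hidden_state[:, 0, :]  # shape: (num_posts, 768)

# 2. Heterogeneous edges: relation 0 = same time bucket, relation 1 = same location.
edges, etypes = [], []
for (i, a), (j, b) in itertools.combinations(enumerate(posts), 2):
    if a[1] == b[1]:
        edges += [(i, j), (j, i)]
        etypes += [0, 0]
    if a[2] == b[2]:
        edges += [(i, j), (j, i)]
        etypes += [1, 1]
edge_index = torch.tensor(edges, dtype=torch.long).t()
edge_type = torch.tensor(etypes, dtype=torch.long)

# 3. Two R-GCN layers over the post graph, then a disaster / non-disaster decision.
class SpatioTemporalRGCN(torch.nn.Module):
    def __init__(self, in_dim=768, hid_dim=128, num_rel=2, num_classes=2):
        super().__init__()
        self.conv1 = RGCNConv(in_dim, hid_dim, num_relations=num_rel)
        self.conv2 = RGCNConv(hid_dim, num_classes, num_relations=num_rel)

    def forward(self, x, edge_index, edge_type):
        h = torch.relu(self.conv1(x, edge_index, edge_type))
        return self.conv2(h, edge_index, edge_type)

model = SpatioTemporalRGCN()
logits = model(x, edge_index, edge_type)  # shape: (num_posts, 2)
print(logits.argmax(dim=-1))

In this sketch the relation type on each edge is what lets the R-GCN learn separate weights for temporal and spatial neighborhoods, which is the role the heterogeneous graph plays in the paper's setup.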
Keywords
Disaster Detection; Social Media; Graph Convolutional Networks; SageMaker; Text Categorization