Tweets Topic Classification and Sentiment Analysis Based on Transformer-Based Language Models

Ranju Mandal, Jinyan Chen, Susanne Becken, Bela Stantic

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

People share their thoughts, perceptions, and activities through a wide range of channels, including social media. The wide adoption of social media produces vast volumes of valuable data that vary in both format and veracity. Analysis of such 'big data' allows organizations and analysts to make better and faster decisions. However, this data must be quantified and information extracted from it, which can be very challenging because of possible ambiguity and complexity in the data. To address information extraction, many analytic techniques have been proposed in the literature, including text mining, machine learning, predictive analytics, and diverse natural language processing methods. Recent advances in Natural Language Understanding, specifically transformer-based architectures, can solve sequence-to-sequence modeling tasks while handling long-range dependencies efficiently. In this work, we applied transformer-based sequence modeling to topic classification and sentiment analysis of short texts from user-posted tweets. The applicability of the models is investigated on posts from the Great Barrier Reef tweet dataset, and the findings are encouraging, providing insights that can be valuable for researchers working on the classification of large datasets with a large number of target classes.
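The abstract attributes the transformers' efficiency on long-range dependencies to their architecture; the core mechanism behind this is scaled dot-product self-attention, in which every token attends directly to every other token in the sequence. The sketch below is a minimal NumPy illustration of that general mechanism, not the specific models or dataset used in the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V: arrays of shape (seq_len, d_k) for queries, keys, values.
    Returns the attended output and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to stabilize softmax.
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of all value vectors,
    # so distant positions influence each other in a single step.
    return weights @ V, weights

# Toy example: a sequence of 4 token embeddings of dimension 8.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
output, attn = scaled_dot_product_attention(tokens, tokens, tokens)
```

Because every position is connected to every other in one matrix product, path length between any two tokens is constant, which is what lets transformers capture long-range dependencies more directly than recurrent models.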

Original language: English
Pages (from-to): 117-134
Number of pages: 18
Journal: Vietnam Journal of Computer Science
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 1 May 2023

Keywords

  • deep learning
  • natural language processing
  • target classification
  • topic classification
  • Transformer
