Optimized aspect and self-attention aware LSTM for target-based semantic analysis (OAS-LSTM-TSA)

B. Vasavi (Vardhaman College of Engineering, Hyderabad, India)
P. Dileep (Malla Reddy College of Engineering & Technology, Hyderabad, India)
Ulligaddala Srinivasarao (Department of Computer Science and Engineering, GITAM (Deemed to be University), Hyderabad, India)

Data Technologies and Applications

ISSN: 2514-9288

Article publication date: 29 December 2023

Abstract

Purpose

Aspect-based sentiment analysis (ASA) is a sentiment analysis task that predicts the sentiment polarity of each aspect in a given sentence. Many traditional techniques use graph-based mechanisms, which reduce prediction accuracy and introduce large amounts of noise. A further problem with graph-based mechanisms is that the sentiment of some context words changes with the aspect, so those words cannot be interpreted in isolation. ASA is challenging because a single sentence can express complicated feelings about multiple aspects.

Design/methodology/approach

This research proposed an optimized attention-based DL model known as optimized aspect and self-attention aware long short-term memory for target-based semantic analysis (OAS-LSTM-TSA). The proposed model goes through three phases: preprocessing, aspect extraction and classification. Aspect extraction is done using a double-layered convolutional neural network (DL-CNN). The optimized aspect and self-attention embedded LSTM (OAS-LSTM) is used to classify aspect sentiment into three classes: positive, neutral and negative.
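The self-attention idea used in the classification phase can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes plain scaled dot-product self-attention over a sequence of LSTM hidden states (represented here as Python lists of floats), and the function name `self_attention` is chosen purely for illustration.

```python
import math

def self_attention(hidden_states):
    """Scaled dot-product self-attention over a sequence of hidden
    states (each a list of floats). Returns one attended vector per
    time step, plus the attention weight matrix."""
    d = len(hidden_states[0])
    # Pairwise similarity scores, scaled by sqrt(hidden size)
    scores = [[sum(a * b for a, b in zip(hi, hj)) / math.sqrt(d)
               for hj in hidden_states] for hi in hidden_states]
    outputs, weights = [], []
    for row in scores:
        # Numerically stable row-wise softmax -> attention weights
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        w = [e / z for e in exps]
        weights.append(w)
        # Attended vector: weighted sum of all hidden states
        outputs.append([sum(wj * hj[k] for wj, hj in zip(w, hidden_states))
                        for k in range(d)])
    return outputs, weights

# Toy example: 3 time steps, hidden size 2
h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out, w = self_attention(h)
print(len(out), len(out[0]))  # 3 2
```

In the paper's setting, such a layer would let each time step weight the aspect-relevant context words before the sentiment classifier sees the sequence.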

Findings

The optimized aspect and self-attention embedded LSTM (OAS-LSTM) model detects and classifies the sentiment polarity of each aspect. The results show that the proposed method achieves a high accuracy of 95.3 per cent on the restaurant dataset and 96.7 per cent on the laptop dataset.

Originality/value

The novelty of this work lies in adding two effective attention layers to the network model and in reducing the loss function and enhancing accuracy with a recent, efficient optimization algorithm. The loss function in OAS-LSTM is minimized using the adaptive pelican optimization algorithm, which increases the accuracy rate. The performance of the proposed method is validated on four real-time datasets (Rest14, Lap14, Rest15 and Rest16) across several performance metrics.

Acknowledgements

Funding: This research did not receive any specific grant from a funding agency in the public, commercial or not-for-profit sector.

Conflict of interest: The authors declare no potential conflict of interest.

Citation

Vasavi, B., Dileep, P. and Srinivasarao, U. (2023), "Optimized aspect and self-attention aware LSTM for target-based semantic analysis (OAS-LSTM-TSA)", Data Technologies and Applications, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/DTA-10-2022-0408

Publisher

Emerald Publishing Limited

Copyright © 2023, Emerald Publishing Limited
