Skip-Gram and Transformer Model for Session-Based Recommendation


Celik E., İLHAN OMURCA S.

APPLIED SCIENCES-BASEL, vol.14, no.14, 2024 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 14 Issue: 14
  • Publication Date: 2024
  • DOI: 10.3390/app14146353
  • Journal Name: APPLIED SCIENCES-BASEL
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Agricultural & Environmental Science Database, Applied Science & Technology Source, Communication Abstracts, INSPEC, Metadex, Directory of Open Access Journals, Civil Engineering Abstracts
  • Kocaeli University Affiliated: Yes

Abstract

Session-based recommendation uses the past clicks and interaction sequences of anonymous users to predict the item most likely to be clicked next. Predicting a user's subsequent behavior in online transactions is difficult mainly because of the lack of user information and the limited behavioral information available. Existing methods, such as recurrent neural network (RNN)-based models that model the user's past behavior sequence and graph neural network (GNN)-based models that capture potential relationships between items, miss the varying time intervals within the past behavior sequence and, owing to the characteristics of these neural networks, can capture only certain types of user interest patterns. Graph models constructed to enrich the current session can also reduce the model's accuracy by introducing irrelevant items. Moreover, the attention mechanisms in recent approaches have proven insufficient because of weak representations of users and products. In this study, we propose a model that combines skip-gram and a transformer (SkipGT) to address the drawbacks mentioned above in session-based recommendation systems. In the proposed method, skip-gram both captures chained user interest within the session through item-specific subsequences and learns complex interaction information between items. The method then captures short-term and long-term preference representations to predict the next click with the help of a transformer. The transformer in our model overcomes many of the limitations of recurrence-based models and models longer contextual connections between items more effectively. Because the transformer receives the item embeddings already trained by the skip-gram model as input, it performs better than when learning item representations from scratch. Through extensive experiments on three real-world datasets, we confirm that SkipGT significantly outperforms state-of-the-art solutions, with an average MRR improvement of 5.58%.
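The two-stage pipeline the abstract describes (pretrain item embeddings with skip-gram on session click sequences, then feed those embeddings to a transformer-style attention layer that scores the next item) can be sketched as follows. This is a minimal NumPy toy under assumed settings — the tiny sessions, dimensions, and single identity-projection attention head are illustrative and not the paper's actual SkipGT architecture or hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def skipgram_pairs(session, window=2):
    """Generate (center, context) training pairs from one session's item sequence."""
    pairs = []
    for i, center in enumerate(session):
        lo, hi = max(0, i - window), min(len(session), i + window + 1)
        pairs.extend((center, session[j]) for j in range(lo, hi) if j != i)
    return pairs

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy sessions (item ids); in the paper these are anonymous click streams.
sessions = [[0, 1, 2, 3], [1, 2, 3, 4], [0, 2, 4, 1]]
n_items, dim, lr = 5, 8, 0.05

# Stage 1: pretrain item embeddings with skip-gram + one negative sample per pair.
W_in = rng.normal(0, 0.1, (n_items, dim))   # center-item embeddings
W_out = rng.normal(0, 0.1, (n_items, dim))  # context-item embeddings
for _ in range(200):
    for s in sessions:
        for c, ctx in skipgram_pairs(s, window=2):
            for tgt, label in [(ctx, 1.0), (int(rng.integers(n_items)), 0.0)]:
                grad = sigmoid(W_in[c] @ W_out[tgt]) - label
                g_in = grad * W_out[tgt]          # keep old W_out for the W_in update
                W_out[tgt] -= lr * grad * W_in[c]
                W_in[c] -= lr * g_in

# Stage 2: scaled dot-product self-attention over the pretrained embeddings
# (a single head with identity projections, standing in for the transformer).
def attend(E):
    scores = E @ E.T / np.sqrt(E.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)
    return A @ E

session = [0, 1, 2]               # current session's clicks
H = attend(W_in[session])
rep = H[-1]                       # last position as the session preference vector
logits = W_in @ rep               # score every candidate item for the next click
next_item = int(np.argmax(logits))
```

The key point mirrored from the abstract is that the attention layer starts from skip-gram embeddings rather than random ones, so co-occurrence structure learned across sessions is available before any next-click supervision is applied.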