
Research on Recommendation Algorithms of Deep Neural Network Based on Attention Mechanism

Posted on: 2022-01-05    Degree: Master    Type: Thesis
Country: China    Candidate: Y Zhang    Full Text: PDF
GTID: 2518306602965489    Subject: Master of Engineering
Abstract/Summary:
In recent years, the rapid development of recommender systems has fundamentally changed the way we shop. As e-commerce permeates every aspect of daily life, recommendation algorithms have come to play a decisive role on major Internet platforms and in their applications. A recommender system operates on users' historical interaction data, mining their latent needs and making targeted recommendations. However, user interest exhibits both long-term trends and short-term peaks, so accurately modeling the expression of user interest has become the core problem in building an effective recommendation algorithm.

Previous work suffers from two problems. First, the historical interaction sequence is treated as a single embedding in which every item enters the computation with the same weight and no distinction; such an embedding can only capture the user's overall interest and cannot reflect the evolution and peaks of interest. Second, most multi-interest modeling methods discard temporal and sequential information entirely and recommend from a simple aggregation of complex, scattered interest points.

To address these problems, this thesis first studies existing deep neural network models in depth, in particular their attention mechanisms and underlying principles, and then, drawing on the ideas of capsule networks and the attention mechanism, proposes a new Transformer-based sequential recommendation model. The model effectively integrates a multi-interest aggregation capsule module with a Transformer sequence feature extractor, thereby solving both problems.

For long- and short-term multi-interest modeling, this thesis proposes a multi-interest capsule network module that models related candidate items from a large item pool as several dense interest capsules. Its aggregation module groups different interest items into cluster-level capsule representations according to similarity and relevance, and its attention activation module then captures the latent connection between the interest capsule representations and the user's current interest peak.

To address the loss of temporal information caused by aggregation, this thesis proposes a Transformer module based on the self-attention mechanism. Relative position embedding information is added when the original low-dimensional item embeddings are built, so that time-series information is retained and the trend of interest evolving over time can be modeled. The module learns the overall latent representation of a sequence through stacked multi-layer self-attention and feed-forward blocks that make up the Transformer.

In this thesis, comprehensive experiments and analysis of the proposed algorithm are carried out on a real public dataset. Controlled ablation experiments verify the effectiveness of each functional module and explain how each component works. Compared with several typical classic recommendation algorithms, the experimental results and analysis show that the proposed algorithm achieves better performance on both HR and NDCG, with average improvements of 7.2% in HR and 10.0% in NDCG, while its convergence speed and training efficiency exceed those of the vast majority of the baseline algorithms. This demonstrates the superiority of the algorithm presented in this thesis on the candidate-item generation task.
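
To make the capsule aggregation described above concrete, the following is a minimal PyTorch sketch of the idea: item embeddings are routed into a few interest capsules by similarity, and an attention step then activates the capsules against the user's current interest. The squash non-linearity, the routing-style update, and all sizes (number of capsules, routing iterations, embedding dimension) are illustrative assumptions, not the thesis's actual implementation.

    # Sketch only: capsule-style multi-interest aggregation (assumed details).
    import torch
    import torch.nn.functional as F

    def squash(x, dim=-1, eps=1e-8):
        """Capsule squash: keeps direction, maps the norm into [0, 1)."""
        sq_norm = (x * x).sum(dim=dim, keepdim=True)
        return (sq_norm / (1.0 + sq_norm)) * x / torch.sqrt(sq_norm + eps)

    def interest_capsules(item_emb, num_capsules=4, routing_iters=3):
        """Aggregate a user's item embeddings (seq_len, d) into
        (num_capsules, d) interest capsules via similarity-based routing."""
        seq_len, d = item_emb.shape
        logits = item_emb.new_zeros(num_capsules, seq_len)   # routing logits
        for _ in range(routing_iters):
            coupling = F.softmax(logits, dim=0)              # items pick capsules
            capsules = squash(coupling @ item_emb)           # (num_capsules, d)
            logits = logits + capsules @ item_emb.T          # agreement update
        return capsules

    def attention_activate(capsules, current_item):
        """Weight the capsules by relevance to the current interest peak."""
        d = capsules.shape[-1]
        scores = F.softmax(capsules @ current_item / d ** 0.5, dim=0)
        return scores @ capsules                             # fused interest (d,)

    # Usage: 20 historical items, 64-dim embeddings, one query vector.
    hist = torch.randn(20, 64)
    caps = interest_capsules(hist)
    user_vec = attention_activate(caps, torch.randn(64))
    print(caps.shape, user_vec.shape)    # (4, 64) and (64,)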
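The Transformer sequence module can be sketched in the same spirit: item embeddings plus position information fed through stacked self-attention blocks with a causal mask, so each step attends only to earlier interactions. The thesis describes relative position embeddings; a learnable absolute position table is substituted here for brevity, and all layer sizes are illustrative assumptions.

    # Sketch only: self-attention sequence encoder (assumed sizes, absolute
    # position table standing in for the thesis's relative positions).
    import torch
    import torch.nn as nn

    class SeqTransformer(nn.Module):
        def __init__(self, num_items, d_model=64, n_heads=2, n_layers=2,
                     max_len=50):
            super().__init__()
            self.item_emb = nn.Embedding(num_items, d_model, padding_idx=0)
            self.pos_emb = nn.Embedding(max_len, d_model)    # order information
            layer = nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=4 * d_model,
                batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)

        def forward(self, item_ids):                         # (batch, seq_len)
            seq_len = item_ids.size(1)
            pos = torch.arange(seq_len, device=item_ids.device)
            x = self.item_emb(item_ids) + self.pos_emb(pos)
            # Causal mask: each position sees only earlier interactions.
            mask = torch.triu(
                torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
            return self.encoder(x, mask=mask)    # (batch, seq_len, d_model)

    model = SeqTransformer(num_items=1000)
    out = model(torch.randint(1, 1000, (8, 50)))
    print(out.shape)    # (8, 50, 64)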
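For reference, the two reported metrics can be computed as follows, assuming one ground-truth item per user and a ranked list of recommended item ids; the function names and cutoff K=10 are hypothetical choices for illustration.

    # Sketch only: HR@K and NDCG@K for a single relevant item per user.
    import math

    def hr_at_k(ranked, target, k=10):
        """Hit Ratio: 1 if the target appears in the top-k, else 0."""
        return 1.0 if target in ranked[:k] else 0.0

    def ndcg_at_k(ranked, target, k=10):
        """NDCG with one relevant item: 1/log2(rank+2) at the hit position."""
        for rank, item in enumerate(ranked[:k]):
            if item == target:
                return 1.0 / math.log2(rank + 2)
        return 0.0

    # Usage: target 42 ranked third -> HR@10 = 1.0, NDCG@10 = 0.5.
    print(hr_at_k([7, 3, 42, 9], 42), ndcg_at_k([7, 3, 42, 9], 42))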
Keywords/Search Tags:Recommender System, Deep Neural Network, Transformer, Multi-Interest Capsule, Self-Attention Mechanism