
Attentional Factorization Machines

Posted on: 2019-04-01    Degree: Master    Type: Thesis
Country: China    Candidate: H Ye    Full Text: PDF
GTID: 2428330548979818    Subject: Computer Science and Technology
Abstract/Summary:
Factorization Machines (FMs) are a supervised learning approach that enhances the linear regression model by incorporating second-order feature interactions. Despite their effectiveness, FMs can be hindered by modelling all feature interactions with the same weight, as not all feature interactions are equally useful and predictive. For example, interactions with less useful features may even introduce noise and degrade performance. In this work, we improve FM by discriminating the importance of different feature interactions. We propose a novel model named Attentional Factorization Machine (AFM), which learns the importance of each feature interaction from data via a neural attention network. In addition, we propose a variant named Deep Attentional Factorization Machine (DAFM), which captures higher-order feature interactions through additional fully connected layers. Extensive experiments on two real-world datasets demonstrate the effectiveness of AFM. Empirically, on regression tasks AFM outperforms FM with an 8.5% relative improvement, and consistently outperforms the state-of-the-art deep learning methods Wide&Deep [1] and DeepCross [2] with a much simpler structure and fewer model parameters; this work was published at IJCAI 2017. On classification tasks, the attention mechanism also brings an 8.5% relative improvement and makes AFM beat Wide&Deep and DeepCross as well; meanwhile, DAFM achieves a further improvement on some datasets owing to its ability to capture higher-order feature interactions.
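
The core component described above is an attention-based pooling over the pairwise (second-order) feature interactions. The following is a minimal sketch of such a layer in PyTorch, assuming the sparse input has already been mapped to one embedding vector per feature field; the class and parameter names (AttentionalInteraction, embed_dim, attn_dim) are illustrative and not taken from the thesis, and the linear (first-order) part of FM is omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionalInteraction(nn.Module):
    def __init__(self, embed_dim: int, attn_dim: int):
        super().__init__()
        self.attn = nn.Linear(embed_dim, attn_dim)      # W, b of the attention network
        self.proj = nn.Linear(attn_dim, 1, bias=False)  # h: scores each interaction
        self.out = nn.Linear(embed_dim, 1, bias=False)  # p: maps pooled vector to a scalar

    def forward(self, emb):
        # emb: (batch, num_fields, embed_dim), one embedding per non-zero feature field
        i, j = torch.triu_indices(emb.size(1), emb.size(1), offset=1)
        pair = emb[:, i, :] * emb[:, j, :]               # element-wise products v_i * v_j
        score = self.proj(F.relu(self.attn(pair)))       # (batch, num_pairs, 1)
        alpha = torch.softmax(score, dim=1)              # attention weights over interactions
        pooled = (alpha * pair).sum(dim=1)               # attention-weighted sum pooling
        return self.out(pooled).squeeze(-1)              # second-order term of the prediction

# Toy usage: 4 fields with 8-dimensional embeddings
emb = torch.randn(2, 4, 8)
print(AttentionalInteraction(8, 16)(emb).shape)          # torch.Size([2])

Replacing the softmax attention weights with a constant 1 for every pair recovers the plain FM pooling, which is what the attention network is meant to improve on; the DAFM variant would additionally stack fully connected layers on top of the pooled vector.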
Keywords/Search Tags: Factorization Machines, Attention Network, Feature Interaction, Neural Network, Machine Learning