
Degree-of-interest Model In Live Shopping Based On Eye Movement Characteristics And DeepFM

Posted on: 2022-12-24
Degree: Master
Type: Thesis
Country: China
Candidate: H Li
Full Text: PDF
GTID: 2518306779471784
Subject: Trade Economy
Abstract/Summary:
Online live shopping has become one of the everyday channels through which people obtain shopping information. Understanding users' interest during a live broadcast can not only improve merchants' broadcasting strategies and increase viewers' satisfaction, but also help designers develop more user-friendly live interaction methods, improve the user experience, and generate greater economic benefits for the enterprise. Studying users' interest in shopping live broadcasts and building a stable degree-of-interest model is therefore of great practical significance for Internet companies and live-commerce departments. A degree-of-interest model usually relies on users' behavioral characteristics. Eye movement characteristics during a live broadcast reflect the viewer's attention to the product and are thus a very important behavioral signal; however, largely because such data are difficult to collect, existing degree-of-interest models do not consider eye movement features. To capture users' interest in products more effectively, this thesis introduces several key eye-movement indicators as important factors in a degree-of-interest model designed on the DeepFM architecture. Comparative experiments with existing models on multiple datasets show that the improved model achieves a lower Logloss and a better AUC. Finally, the thesis also designs and implements an eye tracking data acquisition and interest prediction method based on the model. Specifically, the research results mainly include the following four aspects:

(1) To address the limitation that existing eye trackers cannot automatically process eye movement data across all dimensions, an eye movement data acquisition system, FDIMP, is designed to automatically extract eye movement parameters such as the attention-time rate. The user first frames the tracking target region in the first frame of the video; the system then uses the improved deep learning tracker DIMP to locate the target in every subsequent frame; next, it accumulates the attention time over the whole process; finally, it outputs the time rate and other eye movement parameters (a minimal pipeline sketch is given after this abstract).

(2) A new degree-of-interest model, EDIPDF, is proposed. The DeepFM model has difficulty capturing some associations across data of different dimensions and struggles to highlight important features when many types of feature information are present. To solve these problems, EDIPDF integrates multi-category information by adding a multimodal knowledge graph module on top of DeepFM and introduces an attention mechanism to learn important features effectively (a model sketch is also given after this abstract).

(3) The proposed EDIPDF model is compared with other classic models on two different datasets. The experimental results show that EDIPDF outperforms the other models in accuracy; compared with the native DeepFM algorithm, the AUC is improved by up to 9.32%. The thesis also examines how the individual improvements affect model performance and analyzes the influence of the fully connected layers and the number of iterations.

(4) Based on the EDIPDF model, an eye movement data acquisition and interest prediction system is designed and developed. The thesis discusses the front-end and back-end frameworks and related technologies used by the system in detail and presents screenshots and functions of each module.
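Below is a minimal sketch of the acquisition pipeline outlined in (1): framing the target in the first frame, tracking it with DIMP in each subsequent frame, and accumulating attention time into a time rate. The abstract does not specify the improved DIMP implementation or the eye tracker's export format, so `dimp_track` and the `gaze_samples` layout are hypothetical placeholders; OpenCV is assumed only for video I/O and manual ROI selection.

```python
# Minimal sketch of an FDIMP-style pipeline (assumptions noted in comments).
# The improved DIMP tracker itself is not specified in the abstract; the
# placeholder hook `dimp_track` stands in for it here.

import cv2  # OpenCV for video I/O and manual ROI selection


def dimp_track(frame, prev_box):
    """Placeholder for the improved DIMP tracker: given the current frame and
    the previous bounding box (x, y, w, h), return the updated box."""
    raise NotImplementedError("plug in the DIMP tracking model here")


def gaze_in_box(gaze_xy, box):
    """Return True if a gaze sample (x, y) falls inside box (x, y, w, h)."""
    gx, gy = gaze_xy
    x, y, w, h = box
    return x <= gx <= x + w and y <= gy <= y + h


def attention_time_rate(video_path, gaze_samples):
    """Compute the attention-time rate: the fraction of frames in which the
    gaze point lies on the tracked product region.

    gaze_samples: dict mapping frame index -> (gaze_x, gaze_y); a hypothetical
    format, since the eye tracker's export format is not given in the abstract.
    """
    cap = cv2.VideoCapture(video_path)
    ok, first_frame = cap.read()
    if not ok:
        raise IOError(f"cannot read video: {video_path}")

    # Step 1: frame the tracking target in the first frame (manual ROI).
    box = cv2.selectROI("select product region", first_frame)

    attended, total, frame_idx = 0, 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_idx += 1
        # Step 2: locate the target in each subsequent frame with DIMP.
        box = dimp_track(frame, box)
        # Step 3: accumulate attention time whenever gaze hits the target box.
        gaze = gaze_samples.get(frame_idx)
        if gaze is not None:
            total += 1
            if gaze_in_box(gaze, box):
                attended += 1
    cap.release()

    # Step 4: output the time rate (FDIMP additionally outputs other
    # eye movement dimension parameters).
    return attended / total if total else 0.0
```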
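The following is a minimal PyTorch sketch of the EDIPDF idea in (2): a DeepFM backbone whose field embeddings are re-weighted by an attention layer before the FM and deep components. It assumes all features, including eye-movement indicators and multimodal information derived from the knowledge graph, have been discretised into embedding fields; the field sizes, embedding dimensions, and layer widths are illustrative, not the thesis's exact architecture.

```python
# Sketch of a DeepFM variant with field-level attention, in the spirit of
# EDIPDF. Assumptions: every feature (categorical, eye-movement, multimodal/KG)
# is discretised to an index per field; dimensions are illustrative.

import torch
import torch.nn as nn


class EDIPDFSketch(nn.Module):
    def __init__(self, field_dims, embed_dim=16, mlp_dims=(64, 32)):
        super().__init__()
        num_fields = len(field_dims)
        self.embeddings = nn.ModuleList(
            [nn.Embedding(n, embed_dim) for n in field_dims])
        self.linear = nn.ModuleList(
            [nn.Embedding(n, 1) for n in field_dims])
        # Attention over fields: scores each field embedding so that
        # important features are emphasised.
        self.attn = nn.Sequential(
            nn.Linear(embed_dim, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, 1))
        # Deep component: MLP over the concatenated, re-weighted embeddings.
        layers, in_dim = [], num_fields * embed_dim
        for d in mlp_dims:
            layers += [nn.Linear(in_dim, d), nn.ReLU()]
            in_dim = d
        layers.append(nn.Linear(in_dim, 1))
        self.mlp = nn.Sequential(*layers)

    def forward(self, x):
        # x: LongTensor of shape (batch, num_fields), one index per field.
        emb = torch.stack(
            [e(x[:, i]) for i, e in enumerate(self.embeddings)],
            dim=1)                                       # (batch, fields, dim)
        weights = torch.softmax(self.attn(emb), dim=1)   # (batch, fields, 1)
        emb = emb * weights                              # attention re-weighting

        # First-order (linear) FM term.
        first = sum(l(x[:, i]) for i, l in enumerate(self.linear))
        # Second-order FM term: 0.5 * ((sum e)^2 - sum e^2), summed over dims.
        sum_sq = emb.sum(dim=1) ** 2
        sq_sum = (emb ** 2).sum(dim=1)
        second = 0.5 * (sum_sq - sq_sum).sum(dim=1, keepdim=True)
        # Deep term.
        deep = self.mlp(emb.flatten(start_dim=1))
        return torch.sigmoid(first + second + deep).squeeze(-1)
```

A model of this shape would be trained with binary cross-entropy (Logloss) and evaluated with AUC, matching the metrics reported in (3).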
Keywords/Search Tags: Eye movement data, Interest model, Attention mechanism, Multimodal information