
QoS Modeling And QoS-Aware Web API Recommendation

Posted on: 2023-09-29
Degree: Master
Type: Thesis
Country: China
Candidate: J P Chen
Full Text: PDF
GTID: 2568306617983559
Subject: Computer technology
Abstract/Summary:
In web development, a mashup is created by calling multiple open Web APIs and combining them into a new service with more advanced or richer functionality. As more and more Web applications publish open APIs for developers to call, selecting suitable Web APIs to support mashup development has become a research hotspot. In response, researchers have proposed a variety of algorithms and models, from the data perspective and at the modeling level, that suggest Web API candidate sets to developers according to their needs. However, feature fusion and feature enhancement for Web APIs, and in particular how to model and characterize their quality-of-service (QoS) features, require further research. To this end, this thesis proposes solutions that combine pre-trained models, Web API feature fusion and enhancement, and knowledge distillation.

Firstly, based on a pre-trained model, a deep Web API recommendation model, BPT, is proposed. The BERT model is used to encode the textual descriptions of Web APIs and mashups, and recommendation is performed on the resulting features.

Then, considering the impact of service quality information on the experience of using a Web API, a QoS-aware deep recommendation model, QAR, is proposed. BERT serves as the text encoder for feature extraction; the service quality information is represented as embedding vectors through sparse coding and related techniques; and an attention mechanism fuses the Web API quality features with the text features, yielding recommendations based on the enhanced quality features.

Finally, from the perspective of model lightweighting, the BERT model is compressed via distillation, and QADR, a compressed version of QAR based on knowledge distillation, is proposed, which further accelerates model convergence.

Experimental results on a real-world dataset show, in terms of precision, recall, and normalized discounted cumulative gain, that the proposed models outperform other state-of-the-art models. The extraction of text features and the use of quality information substantially improve recommendation accuracy and stability. In addition, comparative experiments on QAR and QADR demonstrate that knowledge distillation greatly speeds up model inference while preserving model performance.
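The abstract states that QoS information is represented as embedding vectors through sparse coding and related techniques. As an illustration only (the thesis's exact encoding is not given here), one common approach is to discretize a continuous QoS value such as response time into buckets, one-hot (sparse) encode the bucket, and map the sparse code through an embedding table; all boundaries and table values below are made up:

```python
def bucketize(value, boundaries):
    """Map a continuous QoS value (e.g. response time in ms) to a bucket index."""
    for i, b in enumerate(boundaries):
        if value < b:
            return i
    return len(boundaries)

def one_hot(index, size):
    """Sparse (one-hot) code for the bucket."""
    vec = [0.0] * size
    vec[index] = 1.0
    return vec

def embed(one_hot_vec, table):
    """Multiply the sparse code by an embedding table (a row lookup)."""
    dim = len(table[0])
    return [sum(one_hot_vec[i] * table[i][d] for i in range(len(table)))
            for d in range(dim)]

# Illustrative numbers: 4 response-time buckets, 3-dimensional embeddings.
boundaries = [50.0, 200.0, 1000.0]   # bucket edges in ms (hypothetical)
table = [[0.1 * (r + 1) * (d + 1) for d in range(3)] for r in range(4)]

code = one_hot(bucketize(120.0, boundaries), 4)   # 120 ms falls in bucket 1
qos_vec = embed(code, table)
```

In a trained model the embedding table would be a learned parameter; here it is fixed only so the sketch runs.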
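The attention-based fusion of quality and text features described above can be sketched as follows. This is a minimal illustration with made-up dimensions, weights, and inputs, not the thesis's actual architecture: each feature vector receives a scalar score, the scores are softmax-normalized into attention weights, and the fused representation is the weighted sum:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scalars."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_fuse(features, score_weights):
    """Fuse equal-length feature vectors with scalar attention.

    features: list of vectors, e.g. [text_vec, qos_vec].
    score_weights: scoring vector; score_i = <score_weights, features[i]>.
    """
    scores = [sum(w * x for w, x in zip(score_weights, f)) for f in features]
    alphas = softmax(scores)
    dim = len(features[0])
    return [sum(a * f[d] for a, f in zip(alphas, features)) for d in range(dim)]

text_vec = [0.9, 0.1, 0.4]   # illustrative text feature (stand-in for BERT output)
qos_vec = [0.2, 0.4, 0.6]    # illustrative QoS embedding
fused = attention_fuse([text_vec, qos_vec], score_weights=[1.0, 1.0, 1.0])
```

Because the attention weights sum to one, each fused coordinate is a convex combination of the corresponding text and quality coordinates.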
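Knowledge distillation, used above to compress QAR into QADR, trains a small student model to match the temperature-softened output distribution of a large teacher. The sketch below uses the standard Hinton-style loss with illustrative logits; the thesis's exact loss and logit values are not specified here:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled, numerically stable softmax."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=2.0, alpha=0.5):
    """alpha * soft cross-entropy (teacher -> student) + (1 - alpha) * hard CE."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student_soft = softmax(student_logits, temperature)
    soft_ce = -sum(t * math.log(s) for t, s in zip(p_teacher, p_student_soft))
    hard_ce = -math.log(softmax(student_logits)[hard_label])
    # The T^2 factor keeps the soft-target term's magnitude comparable
    # across temperatures, as in the original distillation formulation.
    return alpha * (temperature ** 2) * soft_ce + (1 - alpha) * hard_ce

teacher = [4.0, 1.0, 0.5]        # illustrative teacher logits for 3 candidate APIs
good_student = [3.5, 1.2, 0.4]   # roughly mimics the teacher
bad_student = [0.2, 3.0, 2.5]    # misranks the candidates
loss_good = distillation_loss(good_student, teacher, hard_label=0)
loss_bad = distillation_loss(bad_student, teacher, hard_label=0)
```

A student that tracks the teacher's ranking incurs a lower loss than one that inverts it, which is what drives the compressed model toward the teacher's behavior.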
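Among the reported metrics, normalized discounted cumulative gain rewards placing relevant Web APIs near the top of the recommendation list. A minimal NDCG@k implementation using the standard definition (the relevance list below is made up for illustration: 1 marks an API the developer actually used in the mashup):

```python
import math

def dcg_at_k(relevances, k):
    """DCG@k = sum over positions i=1..k of rel_i / log2(i + 1)."""
    return sum(rel / math.log2(i + 1)
               for i, rel in enumerate(relevances[:k], start=1))

def ndcg_at_k(relevances, k):
    """Normalize by the DCG of the ideal (descending-relevance) ranking."""
    idcg = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / idcg if idcg > 0 else 0.0

# Relevant APIs at ranks 1 and 3 out of 5 recommendations (illustrative).
ranked_relevances = [1, 0, 1, 0, 0]
score = ndcg_at_k(ranked_relevances, k=5)
```

An ideal ranking scores exactly 1.0, so NDCG makes lists of different lengths and relevance counts comparable.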
Keywords/Search Tags: Pre-training model, Web API recommendation, Mashup, Quality-aware, Knowledge distillation, DistilBERT