With the development of Internet and artificial intelligence technology, the scale of data in cyberspace has grown explosively, marking society's entry into the era of big data. This poses new challenges to traditional data processing, information storage, and application technologies. As an efficient model for information extraction, storage, and management, the Knowledge Graph brings new ideas for information processing and application in the big data environment. At present, knowledge graphs provide knowledge services for many intelligent applications, such as intelligent retrieval, semantic understanding, automatic question answering, chat-based customer service, and recommendation systems. These intelligent applications constantly generate new data, enriching the knowledge in the knowledge graph and, at the same time, placing new requirements on the development of knowledge graph technology. It is therefore of great significance to study knowledge graph techniques for extracting knowledge from the Internet and applying it to Internet applications.

In this paper, knowledge graph techniques such as knowledge representation and reasoning, entity linking, and joint learning of entities and relations are studied using knowledge representation learning methods. Compared with traditional feature engineering, knowledge representation learning does not require manual design and extraction of features; instead, it automatically learns features from data into embeddings through trained models. These representation vectors can be conveniently used in various tasks and combined deeply with deep learning methods. The main contributions of this paper are as follows:

1. A knowledge graph reasoning model based on entity-relation interaction representation learning. Existing knowledge representation learning methods based on graph neural networks ignore the interaction and internal connection between entities and relations; to address this, a knowledge representation learning model based on entity-relation interaction is proposed. The link prediction task in knowledge reasoning is taken as the downstream task to evaluate the model. The model adopts an encoder-decoder framework: the encoder adds entity-relation interaction modules to a graph attention network to learn entity and relation representations, and the decoder uses a convolutional neural network for link prediction. The model is compared with state-of-the-art models on the general benchmark datasets FB15k-237 and WN18RR, and the experimental results show that its performance improves considerably.

2. An entity linking algorithm based on entity representation aggregation. Existing entity linking work mostly targets a single knowledge graph and uses simple features; to address this, an entity linking model based on entity representation aggregation is proposed, which comprehensively exploits the structural and semantic features of entities in the knowledge bases. The algorithm designs two strategies to aggregate disparate entity representations: one trained on Wikipedia, which mainly encodes textual word features, and one trained on Wikidata, which encodes relational structural features. By aggregating these two kinds of features with different strategies, the algorithm utilizes information from both knowledge bases simultaneously and links entities to both of them. The algorithm is trained on the general dataset AIDA-train and compared with state-of-the-art models on AIDA-test and five WNED test datasets. The experimental results show that the proposed model matches the performance of state-of-the-art models.

3. A joint extraction and linking technique for entities and relations based on a pre-trained language model. When an application is implemented as a pipeline, each task is studied independently and the internal relations between tasks are ignored, so errors accumulate and results degrade; to address this, different joint learning models are studied. Three joint learning models are designed for different application requirements: a joint entity extraction and linking model, a joint entity and relation extraction model, and a joint entity and relation extraction and linking model. The three models realize joint learning by sharing the parameters of a pre-trained language model. They enumerate all possible entities as text spans (n-grams), use the language model as an encoder to learn entity, word, and text representations, and use decoders to perform the downstream tasks jointly and sequentially. The models are tested on the AIDA/CoNLL, NYT, WebNLG, and T-REx datasets, respectively. Experimental results show that, under the same conditions, the joint models outperform pipeline methods and improve the performance of the joint tasks.
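As a concrete illustration of how the link prediction task is used to evaluate a knowledge graph reasoning model, the sketch below computes the standard mean reciprocal rank (MRR) and Hits@k metrics from candidate-entity scores. The toy scores are illustrative assumptions; they stand in for the output of the actual interaction-based encoder-decoder.

```python
# Minimal sketch of link-prediction evaluation (MRR and Hits@k).
# The hard-coded score lists are placeholder assumptions, not the
# output of the entity-relation interaction model described above.

def rank_of_target(scores, target):
    """1-based rank of the correct entity among all candidate scores."""
    target_score = scores[target]
    # Count candidates that score strictly higher than the target.
    return 1 + sum(1 for s in scores if s > target_score)

def mrr_and_hits(ranks, k=10):
    """Mean reciprocal rank and the fraction of ranks within the top k."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(1 for r in ranks if r <= k) / len(ranks)
    return mrr, hits

# Toy example: three test triples, each with scores over five candidate
# entities and the index of the correct tail entity.
score_lists = [
    ([0.9, 0.1, 0.3, 0.2, 0.5], 0),  # correct entity ranked 1st
    ([0.4, 0.8, 0.6, 0.1, 0.2], 2),  # correct entity ranked 2nd
    ([0.7, 0.6, 0.5, 0.9, 0.3], 0),  # correct entity ranked 2nd
]
ranks = [rank_of_target(scores, t) for scores, t in score_lists]
mrr, hits1 = mrr_and_hits(ranks, k=1)
```

Higher MRR and Hits@k indicate that the model ranks the correct entity closer to the top, which is how improvements on FB15k-237 and WN18RR are typically reported.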
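The representation-aggregation idea behind the entity linking contribution can be illustrated with two simple strategies: concatenating the two views, or blending aligned vectors with a mixing weight. The vectors and the weight below are toy assumptions, not the actual Wikipedia- and Wikidata-trained embeddings.

```python
# Two toy strategies for aggregating a semantic (text-based) and a
# structural (graph-based) representation of the same entity.
# All vectors and the mixing weight alpha are illustrative assumptions.

def aggregate_concat(semantic, structural):
    """Strategy 1: preserve both views by concatenating the vectors."""
    return semantic + structural  # list concatenation

def aggregate_weighted(semantic, structural, alpha=0.5):
    """Strategy 2: blend aligned vectors with mixing weight alpha."""
    assert len(semantic) == len(structural)
    return [alpha * s + (1 - alpha) * t for s, t in zip(semantic, structural)]

semantic_vec = [0.2, 0.4, 0.6]      # e.g. learned from Wikipedia text
structural_vec = [1.0, 0.0, -1.0]   # e.g. learned from the Wikidata graph

concat = aggregate_concat(semantic_vec, structural_vec)
blended = aggregate_weighted(semantic_vec, structural_vec, alpha=0.25)
```

Concatenation keeps both feature spaces intact at the cost of a larger vector, while weighted blending keeps the original dimensionality but requires the two spaces to be aligned first.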
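The span-enumeration step shared by the joint models can be sketched directly: every n-gram fragment of a tokenized sentence up to a maximum width is enumerated as a candidate entity, and a shared encoder would then score each candidate. The whitespace tokenizer and the width limit are simplifying assumptions.

```python
def enumerate_spans(tokens, max_width=3):
    """Enumerate all candidate entity spans (n-grams) up to max_width.

    Returns (start, end) pairs with end exclusive; these are the
    candidates a joint extraction model would score with its shared
    pre-trained language model encoder.
    """
    spans = []
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_width, len(tokens)) + 1):
            spans.append((start, end))
    return spans

# Toy sentence, split on whitespace for simplicity.
tokens = "Barack Obama visited Paris".split()
spans = enumerate_spans(tokens, max_width=2)
# Each span can be rendered back to text for inspection.
texts = [" ".join(tokens[s:e]) for s, e in spans]
```

Enumerating spans this way lets extraction and linking share one candidate set, which is what allows the joint models to avoid the error accumulation of a pipeline.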