With the rapid development of science and technology and the arrival of the information age, the explosive growth of data has led to information redundancy and overload, and providing users with accurate information has become a challenge for information retrieval. Entity linking addresses the semantic diversity and ambiguity of natural language: its goal is to accurately link ambiguous entity mentions in context to the corresponding entities in a knowledge base. It is essential for downstream natural language processing (NLP) tasks such as text understanding, question answering, and relation extraction. Most existing entity linking methods use deep neural networks to capture latent features between the context and the entity description and to compute the similarity between mentions and entities, but they ignore the relative relationships among different candidate entities.

This paper proposes an entity linking model based on deep learning and contrastive learning, targeting two challenging tasks: zero-shot entity linking and Chinese short-text entity linking.

Most existing zero-shot entity linking methods rely on curated labeled data, which is costly to produce and relatively scarce, so these models are difficult to apply widely across domains. This paper proposes a zero-shot entity linking model that uses contrastive learning to model the zero-shot problem. Our method adapts latent semantic information to new domains and achieves a new state of the art on the Wikia zero-shot EL dataset.

Chinese short-text entity linking is even more challenging because the available data are limited. Existing Chinese short-text entity linking models are few, and short texts suffer from missing context and noise, so there is still considerable room to improve accuracy. This paper proposes a Chinese short-text entity linking model that encodes mention and entity representations with Pattern-Exploiting Training and, based on contrastive learning, learns the latent relationships between entities in the knowledge base. Our Chinese short-text model is evaluated on the DuEL 2.0 dataset and improves on prior results.
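The core idea of scoring a mention against its candidate entities contrastively, so that the gold entity is pulled above the other candidates rather than scored in isolation, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the InfoNCE-style loss, the cosine scoring, and all names (`info_nce_loss`, `temperature`) are illustrative assumptions, and random vectors stand in for the learned mention and entity encoders.

```python
# Minimal sketch of contrastive entity-linking scoring: the gold entity is
# the positive and the other candidate entities act as negatives, so the
# loss depends on the *relative* ranking among candidates.
import numpy as np

def info_nce_loss(mention_vec, entity_vecs, gold_idx, temperature=0.1):
    """InfoNCE-style loss for one mention over its candidate entities."""
    # Cosine similarity between the mention and each candidate entity.
    m = mention_vec / np.linalg.norm(mention_vec)
    e = entity_vecs / np.linalg.norm(entity_vecs, axis=1, keepdims=True)
    sims = (e @ m) / temperature
    # Softmax over candidates; loss is -log probability of the gold entity.
    logits = sims - sims.max()  # subtract max for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[gold_idx])

rng = np.random.default_rng(0)
mention = rng.normal(size=8)                    # stand-in mention embedding
candidates = rng.normal(size=(4, 8))            # stand-in entity embeddings
candidates[2] = mention + 0.01 * rng.normal(size=8)  # candidate 2 matches

loss_gold = info_nce_loss(mention, candidates, gold_idx=2)
loss_wrong = info_nce_loss(mention, candidates, gold_idx=0)
# The loss is lower when the gold index points at the best-matching candidate.
assert loss_gold < loss_wrong
```

Because the softmax normalizes over all candidates, lowering this loss simultaneously raises the gold entity's score and suppresses the competing candidates, which is what distinguishes the contrastive objective from scoring each mention-entity pair independently.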