
Competition Law Analysis Of The Search Engine Robots Exclusion Protocol

Posted on: 2018-11-16
Degree: Master
Type: Thesis
Country: China
Candidate: M H Xiao
Full Text: PDF
GTID: 2346330536975755
Subject: Law
Abstract/Summary:
The Robots Exclusion Protocol, also known as the Robots protocol or crawler rules, allows a network service provider to place a specific electronic file on its site telling search engines which pages they may crawl; a search engine reads this file to determine whether a given page may be captured. However, because the Robots Exclusion Protocol is not mandatory, a search engine's web crawler can easily bypass the file and capture the information anyway. In 2012, the unfair competition case of Baidu v. Qihoo (360 Search), which centered on the Robots Exclusion Protocol, brought a series of Internet-related legal issues into public view, and related cases have emerged ever since. Because China's existing law contains no direct and explicit provisions on the application of the Robots Exclusion Protocol, and because scholars dispute the issue and judicial practice has produced divergent results, the problem of its legal regulation urgently needs to be solved.

This paper argues that although the Robots protocol is a technical means, its legal nature cannot be ignored. In particular, the signing of the Internet Search Engine Service Self-discipline Convention in China has made the Robots protocol one of the industry's most important practices. As the core of the triangular relationship among Internet users, search engine service providers, and Internet content providers, the Robots protocol and its related norms should be supported and observed by every participant on the Internet.

From the perspective of anti-unfair-competition law, business ethics as represented by industry standards is often an important factor in determining whether conduct constitutes unfair competition. Compliance with the Robots protocol when crawling other websites' information is in fact a universal business ethic in the industry, and violating it may constitute unfair competition. At the same time, network content service providers have the right to set the content of their Robots protocol files freely, but they are still constrained by the principle of Internet interoperability. In addition, as China's Anti-Unfair Competition Law undergoes a transition between the old and the new, legislators should proceed from reality and broaden their horizons, strengthen the judicial authorities' ability to identify the various types of unfair competition behavior, and construct China's legal system in a more standardized and complete way.

From the perspective of antitrust law, although search engine service providers compete with one another, they should also respect and adapt to competition; only by strengthening themselves and optimizing their services through competition can they achieve breakthroughs and growth. The Robots protocol was originally established to safeguard the security and stability of Internet information, and it should not become a tool for a market dominator to limit or exclude competition.
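The mechanism described in the abstract can be illustrated with a short sketch using Python's standard-library `urllib.robotparser`. The robots.txt content, the "ExampleBot" crawler name, and the URLs are hypothetical, chosen only to show how a compliant crawler checks permission before fetching a page:

```python
from urllib import robotparser

# A minimal, hypothetical robots.txt: block the illustrative "ExampleBot"
# entirely, and keep all other crawlers out of /private/.
robots_txt = """\
User-agent: ExampleBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler consults can_fetch() before capturing a URL.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("ExampleBot", "https://example.com/anything"))  # False
```

Note that, as the abstract stresses, nothing in the protocol enforces these answers: a crawler that simply never calls `can_fetch()` bypasses the file entirely, which is why the thesis turns to competition law for regulation.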
Keywords/Search Tags:search engine, robots exclusion protocol, unfair competition, monopolistic behavior