
Detecting Techniques Of XSS Vulnerabilities Based On Web Crawler

Posted on: 2015-10-25
Degree: Master
Type: Thesis
Country: China
Candidate: F F Wan
Full Text: PDF
GTID: 2298330431998589
Subject: Computer Science and Technology
Abstract/Summary:
With the development of science and technology, the Internet has become an inseparable part of our lives. It provides great convenience, promotes social development and progress, and has ushered in a new era; it is widely used in commerce, education, entertainment, and other fields. The Internet's connectivity, openness, and interactivity greatly facilitate information sharing across society, but they also bring many security risks. Internet attacks are easy to mount because these risks stem mainly from the vulnerability of networks: inherent defects in network communication protocols, vulnerabilities in software and network services, and security flaws in network structure and hardware. Web applications in particular contain many vulnerabilities, and cross-site scripting (XSS) attacks top the list, so efficient vulnerability detection is a hot topic of current research.

The paper's structure is as follows:

1. A detailed description of the basic principles, classification, hazards, and prevention strategies of cross-site scripting attacks.
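The reflected-XSS principle covered in point 1 can be illustrated with a minimal sketch. The rendering function and payload below are hypothetical examples, not code from the thesis: an unescaped user parameter is reflected into the page, so attacker-supplied markup executes, while HTML-escaping neutralizes it.

```python
import html

def render_search_page(query: str, escape: bool = True) -> str:
    """Build a search-results page; escaping user input prevents reflected XSS."""
    safe = html.escape(query) if escape else query
    return f"<html><body>Results for: {safe}</body></html>"

payload = "<script>alert(1)</script>"

# Unescaped: the attacker's script tag is reflected verbatim into the page.
print(render_search_page(payload, escape=False))

# Escaped: the markup becomes inert text (&lt;script&gt;...), so it cannot run.
print(render_search_page(payload))
```

This echo-back of an unescaped request parameter is exactly the signature a scanner looks for when it injects probe strings and inspects the response.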
2. A detailed description of the concept of web crawlers, crawling strategies, crawler classification, page analysis, crawl-queue design, and DNS caching strategy.

3. A description of each module of the web-crawler-based XSS vulnerability mining technique, including the crawler module, the code injection module, and the vulnerability detection module. In the crawler module, data is compressed with the Gzip algorithm before sending, an asynchronous I/O completion port model is used for page downloading, pages are parsed with regular expression matching, and duplicate URLs are removed with the M-Interval-Hash algorithm. In the code injection module, both GET and POST requests are tested. In the vulnerability detection module, a vulnerability feature database is used to detect XSS vulnerabilities.

4. A comparison of the M-Interval-Hash algorithm with the MD5 algorithm shows that the proposed algorithm removes duplicate URLs more effectively. In vulnerability detection, a comparison with two other XSS vulnerability scanning tools, Acunetix WVS and XSSer, shows that the technique in this paper detects more XSS vulnerabilities and takes less time to scan.
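The URL-deduplication step in the crawler module can be sketched as follows. The abstract does not detail the internals of M-Interval-Hash, so this sketch implements only the MD5-digest baseline that the thesis compares against; the class name and interface are hypothetical.

```python
import hashlib

class URLDeduplicator:
    """Visited-URL filter using MD5 digests (the baseline the thesis
    compares its M-Interval-Hash algorithm against; M-Interval-Hash
    itself is not specified in this abstract)."""

    def __init__(self) -> None:
        # Store fixed-size digests instead of full URLs to bound memory per entry.
        self._seen: set[str] = set()

    def is_new(self, url: str) -> bool:
        """Return True the first time a URL is offered, False on repeats."""
        digest = hashlib.md5(url.encode("utf-8")).hexdigest()
        if digest in self._seen:
            return False
        self._seen.add(digest)
        return True

dedup = URLDeduplicator()
print(dedup.is_new("http://example.com/a"))  # first visit
print(dedup.is_new("http://example.com/a"))  # duplicate, filtered out
print(dedup.is_new("http://example.com/b"))  # new URL, enqueued
```

In a crawler, `is_new` would gate the crawl queue: only URLs that pass the filter are downloaded, which keeps the scanner from re-injecting test payloads into pages it has already covered.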
Keywords/Search Tags: XSS, Web Crawler, URL Reduplication Removing, Vulnerability Detection