
Black List Issue

Question asked by MemoryCorruption on Oct 10, 2011
Latest reply on Oct 11, 2011 by jkent

Recently, I have noticed some issues when scanning websites that use an internal search feature such as Google Site Search. For one, the scan results generally show the site search function as being vulnerable to reflected XSS or blind SQL injection. I'm guessing this is just because whatever string the scanner injects into the field is reflected on the results page. I understand that false positives happen, so this really isn't a problem. The real issue is that the scanner gets hung up on these search functions and spits out a report with a huge number of false positives. This stops the scanner from ever reaching the interesting parts of the site (due to the 5000-link limit / timeout). So, I decided to try blacklisting the search page altogether. I ended up adding something like this to the blacklist:

 

hxxp://myscannedsite.com/search

 

After the next scan, I noticed that the scanner was still getting stuck on the search function of the website.  I do not have any whitelist entries listed in the web application.  Is there a better way to stop the scanner from scanning this portion of the website? Am I doing something wrong? Thanks for any input. 
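One possible explanation (an assumption, not confirmed behavior of this particular scanner): some scanners match blacklist entries as regular expressions or against the full request URL including the query string, so a bare entry like the one above may only block the exact /search URL and not the /search?q=... variants the crawler keeps finding. The Python sketch below just illustrates that difference using made-up URLs; the entry names and matching logic are hypothetical.

```python
import re

# Hypothetical URLs the crawler might discover for the site-search function
# (assumed for illustration only).
crawled = [
    "http://myscannedsite.com/search",
    "http://myscannedsite.com/search?q=widgets",
    "http://myscannedsite.com/search?q=widgets&page=2",
    "http://myscannedsite.com/products/widgets",
]

# Exact-string blacklist entry: only the bare /search URL matches.
exact_entry = "http://myscannedsite.com/search"
blocked_exact = [u for u in crawled if u == exact_entry]

# Regex-style blacklist entry: matches /search and every query-string variant.
regex_entry = re.compile(r"^http://myscannedsite\.com/search(\?.*)?$")
blocked_regex = [u for u in crawled if regex_entry.match(u)]

print("exact entry blocks:", blocked_exact)   # 1 URL
print("regex entry blocks:", blocked_regex)   # 3 URLs
```

If the scanner does treat blacklist entries as regular expressions, a pattern covering the query-string variants might be what is needed here; whether that is the case for this product is for the vendor docs or a reply below to confirm.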
