web-sorrow - A Remote Web Scanner for Misconfiguration, Version Detection, and Server Enumeration Written in Perl - Wsorrow.pl

web-sorrow

Wsorrow.pl is a remote web scanner. It scans web servers for misconfigurations, performs version detection and enumeration, and gathers server information. Written in Perl, the script runs out of the box on most Linux systems, including Ubuntu and openSUSE. While many other scripts and programs perform similar scans, web-sorrow is versatile and useful enough to earn a place in your toolbox.

NOTE: Web-Sorrow has now been updated to 1.4.9, so be sure to try out the latest version!

Some of the more notable features of web-sorrow are:

  • CMS (Content Management System) detection
  • Port scanning
  • Login page scanning
  • Proxy support
  • Error bagging
  • Standard tests (see below for full list)

To download the latest packaged version (1.2.7), grab the zip from the project's downloads page or run these commands (if your distro doesn't ship unzip, you may need to install that package first):

wget http://web-sorrow.googlecode.com/files/Wsorrow_v1.2.7.zip

unzip Wsorrow_v1.2.7.zip

To run the script with no switches, run this command:

perl ./Wsorrow.pl

+ web sorrow 1.2.7 Version detection, misconfig, and enumeration tool

usage:
-host [host] - Defines host to scan.
-proxy [ip:port] - use a proxy server [not on -Ps]
-S - Standard misconfig and other checks
-Ps - Scans ports 1-100 with tcp probes
-Eb - Error Bagging. Sometimes a 404 page contains server info such as the daemon or even the OS
-auth - Dictionary attack to find login pages [not passwords]
-cmsPlugins - check for cms plugins [outdated 2010]
-I - Find interesting strings in html [very verbose]
-Fd - look for common interesting files and dirs
-Ws - look for Web Services on host. such as hosting provider, blogging service, favicon fingerprinting, and cms version info
-e - everything. run all scans

Example:
perl Wsorrow.pl -host scanme.nmap.org -S
perl Wsorrow.pl -host scanme.nmap.org -Eb -Ps
perl Wsorrow.pl -host 66.11.227.35 -S -Ws -I -proxy 129.255.1.17:3128

As the examples show, there are several switches you can use. To run every scan against the host scanme.nmap.org, use this command (warning: the -e switch can take a while):

perl ./Wsorrow.pl -host scanme.nmap.org -e

Here is a more detailed list of functionality taken directly from the code.google.com project page:

-S - stands for standard. A set of standard tests that includes: directory-indexing checks, banner grabbing, language detection, robots.txt retrieval, and 200-response testing (some servers send a 200 OK for every request)
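The 200-response test above matters because a server that answers 200 OK for everything makes every other probe meaningless. Here is a minimal Python sketch of that check (an illustration, not web-sorrow's Perl code; `random_path`, `always_responds_200`, and the injected `fetch` callable are all made-up names for this example):

```python
import random
import string

def random_path(length=20):
    """Build a random URL path that almost certainly does not exist."""
    return "/" + "".join(random.choice(string.ascii_lowercase) for _ in range(length))

def always_responds_200(fetch, probes=3):
    """Return True if the server answers 200 OK for several random,
    nonexistent paths -- a sign that status codes are meaningless and
    every apparent 'hit' needs false-positive checking.

    `fetch` is any callable mapping a path to an HTTP status code
    (e.g. a thin wrapper around urllib.request); it is injected so the
    sketch stays testable without a live host."""
    return all(fetch(random_path()) == 200 for _ in range(probes))
```

A scanner would run this once up front and, if it returns True, fall back to comparing response bodies instead of trusting status codes.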

-Ps - stands for port scan. Uses a TCP ping and tests ports 1-100. I will not try to compete with nmap, but if you want to contribute port-range functions or ICMP support, that is fine
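The TCP-probe approach can be sketched in a few lines of Python (again an illustration of the technique, not the tool's implementation; the function name and defaults are assumptions):

```python
import socket

def tcp_scan(host, ports=range(1, 101), timeout=0.5):
    """Try a plain TCP connect() on each port; a completed handshake
    means the port is open. No raw sockets or ICMP are involved, so
    this runs without elevated privileges -- which is also why it is
    far slower and noisier than nmap's SYN scan."""
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or timed out
    return open_ports
```

Connection refusals return almost instantly, so against an unfiltered host the 100-port sweep is quick; filtered ports each cost the full timeout.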

-Eb - stands for error bagging. The default config for many servers puts the server daemon, its version, and sometimes even the OS inside error pages. web-sorrow requests a URL of 20 random bytes with both GET and POST methods
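The idea is simple: request something that cannot exist, then mine the error page for a signature line. A hedged Python sketch (the sample page and regex are illustrative; real error pages vary by daemon and config):

```python
import re

# Hypothetical 404 body of the kind a default Apache config returns:
SAMPLE_404 = """<html><body><h1>Not Found</h1>
<hr><address>Apache/2.4.41 (Ubuntu) Server at example.com Port 80</address>
</body></html>"""

def bag_errors(body):
    """Pull the server daemon/version (and often the OS, in parens)
    out of an error page. Default configs embed a signature such as
    'Apache/2.4.41 (Ubuntu)' in 404 responses."""
    match = re.search(
        r"(Apache|nginx|IIS|lighttpd)[/ ][\w.]+(?:\s*\(([^)]+)\))?", body)
    return match.group(0) if match else None
```

Feeding the sample body through `bag_errors` recovers the full `Apache/2.4.41 (Ubuntu)` signature from a page most visitors would dismiss as a dead end.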

-cms - stands for Content Management System. Tests the host for default CMS files that contain version info. The list of default files is small (as of v1.2.3) but it hits all the popular ones. (As of v1.2.7 this has been absorbed into -Ws)

-auth - looks for login pages using a list of some of the most common login files and dirs. The list of URLs doesn't need to be very big, because what else are you going to name it? notAlogin.php?
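That small-wordlist logic looks something like this in Python (the paths below are a tiny made-up sample, not web-sorrow's actual list):

```python
# A tiny illustrative wordlist -- the real tool ships its own, larger one.
LOGIN_PATHS = [
    "/login.php",
    "/admin/",
    "/admin/login.php",
    "/wp-login.php",
    "/user/login",
    "/signin",
]

def candidate_login_urls(host):
    """Expand the wordlist into full URLs to probe. On a server that
    doesn't answer 200 to everything, a 200 here is a strong hint of
    a real login page."""
    return ["http://" + host + path for path in LOGIN_PATHS]
```

Each candidate URL would then be requested and its status (or body) checked, subject to the same false-positive filtering as the other scans.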

-cmsPlugins - runs a huge list of plugin dirs for CMS servers. The list is a bit old (2010)

-I - searches the index page for interesting strings, including: Facebook pages, Twitter pages, emails, the words bank, account, etc.
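String hunting of this kind is just pattern matching over the page body. A minimal sketch, assuming a few illustrative regexes (web-sorrow's own patterns are not reproduced here):

```python
import re

# Hypothetical page body used only to exercise the patterns:
SAMPLE_HTML = """<a href="https://twitter.com/example">tweets</a>
Contact: webmaster@example.com
<a href="https://www.facebook.com/examplepage">facebook</a>"""

PATTERNS = {
    "email":    r"[\w.+-]+@[\w-]+\.[\w.]+",
    "twitter":  r"twitter\.com/\w+",
    "facebook": r"facebook\.com/[\w.]+",
}

def interesting_strings(body):
    """Return every pattern hit found in the page -- the kind of
    noisy, 'very verbose' output the -I flag warns about."""
    hits = {}
    for name, pattern in PATTERNS.items():
        found = re.findall(pattern, body)
        if found:
            hits[name] = found
    return hits
```

Because the patterns are deliberately loose, the output is verbose by design; the operator is expected to sift it by hand.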

-Ws - looks for web services such as hosting provider, blogging services, favicon fingerprinting, and CMS version info
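Favicon fingerprinting works because many CMSes and appliances ship a default `/favicon.ico`: hash the icon and look it up in a table of known hashes. A sketch, with a deliberately fake fingerprint table (the hash and label below are invented for illustration):

```python
import hashlib

# Illustrative fingerprint table; a real one maps the MD5 of
# /favicon.ico to the software that ships that icon.
FAVICON_FINGERPRINTS = {
    "00000000000000000000000000000000": "ExampleCMS (hypothetical hash)",
}

def identify_favicon(icon_bytes, fingerprints=FAVICON_FINGERPRINTS):
    """Hash the favicon bytes and look the digest up. A match
    identifies the software without requesting any other page."""
    digest = hashlib.md5(icon_bytes).hexdigest()
    return fingerprints.get(digest)
```

The same one-request trick generalizes: any static file a product ships unmodified is a fingerprint.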

-Fd - looks for things people generally don't want you to see. The list is generated from a ton of robots.txt files, so whatever it finds should be interesting

-proxy - sends all HTTP requests via a proxy. Example: 255.255.255.254:8080

-e - run all the scans in the scanner

web-sorrow also performs false-positive checking on most of its requests (it's pretty accurate, but not perfect).

How useful do you find this script, if at all? Do you know of any similar scripts that are more efficient or useful? Please comment and let us know.
