Greaper is a multi-purpose web security scanner that runs security checks against a single website or a list of URLs, including status code checks, vulnerability scanning, content length checks, IP lookups, and more.
Its main features include:

- Status code checks (`-sc`)
- Directory fuzzing (`-df`)
- Subdomain enumeration (`-s`)
- SQL injection, XSS, and LFI vulnerability scans (`-sqli`, `-xss`, `-lfi`)
- IP lookup and bypass (`-ip`)
- Content length checks (`-cl`)
- Security header checks (`-sec`; see the sketch after this list)
- CORS misconfiguration checks (`-cors`)
- Host header injection detection (`-hh`)
- Live URL checks (`-lv`)
- CVE scans by server fingerprint (`-cve`)
- JavaScript file scanning for sensitive information (`-info`)
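For a rough sense of what one of these checks involves, here is a minimal, self-contained sketch of a security header check in the spirit of `-sec`. It is an illustration only, not Greaper's actual implementation, and the header list is an assumption:

```python
import requests

# Security headers a check like -sec typically looks for
# (illustrative set; Greaper's actual list may differ).
EXPECTED_HEADERS = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Frame-Options",
    "X-Content-Type-Options",
]

def missing_security_headers(url):
    """Fetch the URL and return any expected headers absent from the response."""
    response = requests.get(url, timeout=10)
    return [h for h in EXPECTED_HEADERS if h not in response.headers]

if __name__ == "__main__":
    missing = missing_security_headers("https://example.com")
    print("Missing security headers:", ", ".join(missing) or "none")
```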
Requirements:

- Python 3.6 or newer
- The Python packages listed in `requirements.txt`
Installation:

- Clone the repository with `git clone https://github.com/algorethmpwd/greaper.git`, or download `greaper.py` and `requirements.txt` directly.
- Install the dependencies listed in `requirements.txt`:

  ```bash
  pip install -r requirements.txt
  ```
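To confirm the installation worked, you can ask the script for its help text. Assuming `greaper.py` uses a standard Python argument parser that provides `-h` (an assumption, not something documented here):

```bash
python3 greaper.py -h
```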
Greaper can scan a single URL or a list of URLs. Each scan mode is activated with a specific flag.

Scan a single URL:

```bash
python3 greaper.py -u <url> -sc
```

Scan a list of URLs:

```bash
python3 greaper.py -l <file_with_urls.txt> -sc
```

Further examples:

```bash
# Check status codes
python3 greaper.py -u https://example.com -sc

# Fuzz for common directories
python3 greaper.py -u https://example.com -df

# Check for CORS misconfigurations
python3 greaper.py -u https://example.com -cors

# SQL injection scan with a payload file
python3 greaper.py -u https://example.com -sqli -p sqli_payloads.txt

# Check security headers across a list of URLs
python3 greaper.py -l urls.txt -sec
```
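The payload file format for `-p` is not documented here; a common convention for scanners of this kind, and the assumption made in this example, is a plain-text file with one payload per line. A minimal `sqli_payloads.txt` might look like:

```
' OR '1'='1
" OR "1"="1
' OR 1=1--
```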
Available options:

| Option | Description |
|--------|-------------|
| `-u`, `--url` | Specify a single URL to scan. |
| `-l`, `--list` | File containing multiple URLs to scan, one URL per line. |
| `-sc` | Check status codes. |
| `-df` | Directory fuzzing for common paths or with a custom list. |
| `-s` | Enable subdomain enumeration. |
| `-sqli` | Run SQL injection detection. Requires `-p` for payloads. |
| `-xss` | Enable XSS scanning. Requires `-p` with payloads. |
| `-lfi` | Enable Local File Inclusion scanning. Requires `-p`. |
| `-cors` | Scan for CORS misconfigurations. |
| `-hh` | Scan for host header injection. |
| `-ip` | Perform an IP lookup for the target. |
| `-cl` | Check the content length of the target URLs. |
| `-lv` | Check whether the target subdomains are live. |
| `-info` | Scan JavaScript files for sensitive information. |
| `-sec` | Check security headers on the URLs. |
| `-cve` | Scan for CVEs based on the server fingerprint. |
| `-p` | File containing payloads for scans such as SQLi, XSS, and LFI. |
| `-o` | Output file to save results. |
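Assuming flags combine as in the usage examples above, a run that scans a URL list for XSS with a custom payload file and saves the results might look like the following (`xss_payloads.txt` and `results.txt` are placeholder names):

```bash
python3 greaper.py -l urls.txt -xss -p xss_payloads.txt -o results.txt
```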
Notes:

- URL Format: URLs in the file passed to `-l` should include `http://` or `https://`.
- Results: Results are displayed in the terminal and, if an output file is specified with `-o`, saved to that file as well.
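For example, a valid input file for `-l` could look like this (hostnames are placeholders):

```
https://example.com
http://test.example.com
https://staging.example.com
```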