second-order: subdomain takeover scanner
Second Order
Scans web applications for second-order subdomain takeover by crawling the app and collecting URLs (and other data) that match specific rules or respond in a specific way.
Installation
Go version >= 1.8 is required.
go get github.com/mhmdiaa/second-order
This will download the code, compile it, and leave a second-order binary in $GOPATH/bin.
Command line options
Example
go run second-order.go -base https://example.com -config config.json -output example.com -concurrency 10
Configuration File
An example configuration file (config.json) is included; a minimal sketch of such a file is also shown after the list of options below.
Headers: A map of headers that will be sent with every request.
Depth: Crawling depth.
LogCrawledURLs: If this is set to true, Second Order will log the URL of every crawled page.
LogQueries: A map of tag-attribute queries that will be searched for in crawled pages. For example, "a": "href" means log every href attribute of every a tag.
LogURLRegex: A list of regular expressions that will be matched against the URLs that are extracted using the queries in LogQueries; if left empty, all URLs will be logged.
LogNon200Queries: A map of tag-attribute queries that will be searched for in crawled pages, and logged only if they don't return a 200 status code.
ExcludedURLRegex: A list of regular expressions whose matching URLs will not be accessed by the tool.
ExcludedStatusCodes: A list of status codes; if any page responds with one of these, it will be excluded from the results of LogNon200Queries; if left empty, all non-200 pages' URLs will be logged.
LogInlineJS: If this is set to true, Second Order will log the contents of every script tag that doesn't have a src attribute.
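To show how these options fit together, here is a minimal sketch of what a config.json built from them could look like. This is not the bundled example file: the header, query, regex, and status-code values are placeholders, and the exact value types should be checked against the included config.json.

{
    "Headers": {
        "User-Agent": "Second Order"
    },
    "Depth": 2,
    "LogCrawledURLs": false,
    "LogQueries": {
        "a": "href",
        "script": "src"
    },
    "LogURLRegex": [],
    "LogNon200Queries": {
        "script": "src"
    },
    "ExcludedURLRegex": [
        "\\.google\\.com",
        "\\.facebook\\.com"
    ],
    "ExcludedStatusCodes": [
        404
    ],
    "LogInlineJS": true
}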
Output Directory Structure
All results are saved in JSON files that specify what data was found and where it was found.
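As a purely hypothetical illustration (the layout, file names, and URLs below are invented, not taken from the tool's output), a result file for an href/src query might map each crawled page to the URLs that matched on it:

{
    "https://example.com/": [
        "https://assets.old-cdn-provider.com/app.js"
    ],
    "https://example.com/about": [
        "https://retired-subdomain.example.com/script.js"
    ]
}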
Source: https://github.com/mhmdiaa/