second-order: subdomain takeover scanner
Scans web applications for second-order subdomain takeover by crawling the app and collecting URLs (and other data) that match specific rules or respond in a specific way.
Go version >= 1.8 is required.
go get github.com/mhmdiaa/second-order
This will download the code, compile it, and leave a second-order binary in $GOPATH/bin.
Command line options
go run second-order.go -base https://example.com -config config.json -output example.com -concurrency 10
An example configuration file (config.json) is included.
Headers: A map of headers that will be sent with every request.
Depth: Crawling depth.
LogCrawledURLs: If this is set to true, Second Order will log the URL of every crawled page.
LogQueries: A map of tag-attribute queries that will be searched for in crawled pages. For example, "a": "href" means log every href attribute of every a tag.
LogURLRegex: A list of regular expressions that will be matched against the URLs that are extracted using the queries in LogQueries; if left empty, all URLs will be logged.
LogNon200Queries: A map of tag-attribute queries that will be searched for in crawled pages, and logged only if the extracted URLs don't return a 200 status code.
ExcludedURLRegex: A list of regular expressions whose matching URLs will not be accessed by the tool.
ExcludedStatusCodes: A list of status codes; if any page responds with one of these, it will be excluded from the results of LogNon200Queries; if left empty, all non-200 pages' URLs will be logged.
LogInlineJS: If this is set to true, Second Order will log the contents of every script tag that doesn't have a src attribute.
Output Directory Structure
All results are saved in JSON files that specify what data was found and where it was found.