Closed — ZeroDot1 closed this issue 4 years ago
Yeah, noted — we don't need to use the Sublist3r API, because it draws on the same sources we already query, so there's no point wasting time on duplicate results. Instead, we select third-party sources directly, which lets the enumeration process be optimized: more results obtained in less time.
There may be some good sources in there; I'll consider adding them.
Scraping: Ask, Baidu, Bing, DNSDumpster, DNSTable, Dogpile, Exalead, Google, HackerOne, IPv4Info, Netcraft, PTRArchive, Riddler, SiteDossier, ViewDNS, Yahoo
Certificates: Active pulls (optional), Censys, CertSpotter, Crtsh, Entrust, GoogleCT
APIs: AlienVault, BinaryEdge, BufferOver, CIRCL, CommonCrawl, DNSDB, GitHub, HackerTarget, Mnemonic, NetworksDB, PassiveTotal, Pastebin, RADb, Robtex, SecurityTrails, ShadowServer, Shodan, Spyse (CertDB & FindSubdomains), Sublist3rAPI, TeamCymru, ThreatCrowd, Twitter, Umbrella, URLScan, VirusTotal, WhoisXML
Web Archives: ArchiveIt, ArchiveToday, Arquivo, LoCArchive, OpenUKArchive, UKGovArchive, Wayback
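As a minimal sketch of what querying one of these third-party sources looks like, here is a passive pull against crt.sh (listed above as Crtsh). The endpoint and JSON field names are assumptions based on crt.sh's public JSON output, not Amass's actual implementation — the real data-source code lives under the `services` directory linked below.

```python
# Sketch: passive subdomain enumeration via one third-party source (crt.sh).
# Assumption: crt.sh's public JSON endpoint returns a list of certificate
# entries, each with a newline-separated "name_value" field.
import json
import urllib.request

def fetch_crtsh(domain: str) -> list:
    """Query crt.sh for certificates matching *.domain (network call)."""
    url = f"https://crt.sh/?q=%25.{domain}&output=json"
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

def extract_subdomains(entries: list, domain: str) -> set:
    """Deduplicate subdomain names from crt.sh certificate entries."""
    names = set()
    for entry in entries:
        # One certificate can cover several newline-separated names.
        for name in entry.get("name_value", "").splitlines():
            name = name.lstrip("*.").lower()  # drop wildcard prefixes
            if name.endswith(domain):
                names.add(name)
    return names
```

Deduplicating across sources like this is exactly why skipping Sublist3r matters: two sources backed by the same upstream data only add time, not names.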
Some interesting source code can be found via the following link: https://github.com/OWASP/Amass/tree/master/services