All issues and errors from the previous version have been fixed.
D3vLeecher is an automated crawler and scraper tool. It uses Google as its default source of information, but you can choose whatever search engine you want.
- Simple to use:
1- Choose what you want to leech from the list (user:pass combos, email:pass combos, URLs with user:pass, "Search For Site - URLs", or define your own regular expression if you are familiar with how they work)
2- Click on Leech and wait
How do you search for a specific site?
Choose "Search For Site - URLs" from the list, enter the full name of the site you want to crawl (e.g. pornpros.com), then click Leech and wait. The tool will collect URLs for the site you defined.
- The tool uses one search engine (Google) by default, but you can add as many more as you want. There are two ways to do this:
1- Open the "Engine" tab, then either add a list by clicking the folder icon, or add a single search engine by typing it into the text box and clicking the downward arrow
2- Or open Engine.txt in the Settings folder and add the search engines there to use them as defaults
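The exact format of Engine.txt is not documented here, but since the tool reads one engine per entry, a plain one-engine-per-line list is the likely layout. A hypothetical sketch (the entries shown are assumptions, not shipped defaults):

```
google.com
bing.com
duckduckgo.com
```

If the tool expects full search URLs rather than bare domains, adjust each line accordingly; check the shipped Engine.txt for the format it actually uses.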
- Built-in keywords are provided for each leeching mode, but you can add your own keywords for the search engine to use while leeching. To add them:
1- Open the "Key" tab and click the folder icon
2- Or add the keywords you want to the keys.txt file in the Settings folder
- You have two leeching options: leech only with the keywords you assigned, or enable the Loop option (by checking it), in which case the tool uses the captured results as keys for further rounds of looping.
Because looping consumes a lot of memory, it is limited to a maximum of 20 minutes by default; you can change this limit with "Auto Abort After".
- You can choose to auto-save results; they will be saved to \History\Hits.txt.
Finally, the tool saves all leeched URLs in the History folder at \History\Urls_Leeched.