zhaijiahui/URL_collect

URL_collect

Crawls all links found in a website's source code and classifies the crawl results.
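The script's own source isn't shown here, so as a rough illustration of the crawl-and-classify idea, here is a minimal sketch in Python's standard library. All function names and the classification buckets (internal pages, static assets, external URLs) are assumptions for illustration, not the repository's actual implementation:

```python
import re
from urllib.parse import urljoin, urlparse

def extract_links(html, base_url):
    """Pull every href/src value out of a page's source and absolutize it."""
    raw = re.findall(r'(?:href|src)=["\']([^"\']+)["\']', html)
    return [urljoin(base_url, link) for link in raw]

def classify(links, base_url):
    """Split links into same-site pages, static assets, and external URLs."""
    host = urlparse(base_url).netloc
    buckets = {"internal": [], "static": [], "external": []}
    for link in links:
        parsed = urlparse(link)
        if parsed.netloc != host:
            buckets["external"].append(link)
        elif parsed.path.lower().endswith((".js", ".css", ".png", ".jpg", ".gif")):
            buckets["static"].append(link)
        else:
            buckets["internal"].append(link)
    return buckets

# Example: classify links scraped from a small HTML snippet.
html = '<a href="/about">x</a><img src="/logo.png"><a href="http://other.com/">y</a>'
buckets = classify(extract_links(html, "http://www.target.com/"),
                   "http://www.target.com/")
```

A real crawler would fetch each internal link recursively up to the `-d` depth; the sketch only covers the extraction and classification step on one page's source.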

#---------------------------

#URL collect by zhaijiahui

#---------------------------

-u  Target domain to crawl

-d  Crawl depth

-o  Save the results to a file

-s  Delay between requests (seconds), to avoid requesting too fast

Usage:

get_url.py -u http://www.target.com/ -d 2

get_url.py -u http://www.target.com/ -d 2 -s 2 -o
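The flags above can be wired up with `argparse`; the defaults below are assumptions, since the README does not state them, but the flag names and meanings follow the list above:

```python
import argparse

def parse_args(argv):
    """Mirror the documented flags; default values are assumptions."""
    p = argparse.ArgumentParser(description="URL collect")
    p.add_argument("-u", required=True, help="target domain to crawl")
    p.add_argument("-d", type=int, default=1, help="crawl depth")
    p.add_argument("-s", type=float, default=0.0,
                   help="delay between requests, in seconds")
    p.add_argument("-o", action="store_true",
                   help="save the results to a file")
    return p.parse_args(argv)

# Parse the second usage example from the README.
args = parse_args(["-u", "http://www.target.com/", "-d", "2", "-s", "2", "-o"])
```

With `-s` parsed as a float, the crawler can simply `time.sleep(args.s)` between requests to throttle itself.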

bug

About

A script for collecting all URLs found in a website's source code.
