Kali Information Gathering: Using Maltego

Maltego is a powerful automated information-gathering tool: it maps network topology, IP addresses, e-mail addresses and so on, and also offers many plugins (I have not used the plugin features yet). Registering requires going through a VPN: https://www.paterva.com/web7/buy/maltego-clients/maltego-ce.php. We use the free CE edition, so click registration here (https://www.paterva.com/web7/community/community.php) to create an account. I registered a while back; if you have forgotten your password, use Reset Password on the registration page to reset it, as shown below:

maltogo-register

If you have never registered, go through that first page (again via VPN) and create an account. Once registered, launch Maltego in Kali and simply enter your username and password; verifying the credentials does not require a VPN. There is an introductory article here: http://wifibeta.com/2012-03/thread-675-1-1.html, although it does not go into much detail. After a successful install it looks like this:

maltogo -welcome

First, the menus:
Investigate # the intelligence-gathering panel
Manage # management panel, mainly for upgrades and for changing the default configuration
View # display views, mainly shows the related display settings
Organize # controls how the resulting graph/topology is laid out
Machines # project strategies; these can also be created through a wizard
Collaboration # share graphs and work together with others

For everyday use the first menu, Investigate, is the one you need; the UI is packed with menus and features on every side, almost too many to take in.

Click the + in the top-left corner to create a new graph, then double-click or right-click the blank canvas; the Start a Machine wizard pops up, offering a dozen or so machines to choose from:
Company Stalker # gather information about a company
Find Wikipedia edits # gather information about Wikipedia edits
Footprint L* # * is 1-3, the footprinting level
Footprint XXL # uses the locally built-in configuration
Person - Email Address # gather e-mail and IP information
Prune Leaf Entities # customized via the left-hand panel options; not selected by default
Twitter Digger X # gather information via Twitter
Twitter Digger Y # gather information via Twitter, with extended analysis of the results
Twitter Monitor # monitor Twitter
URL To Network And Domain Information # gather network and domain information from a URL

Below those there are two checkboxes:

Show on startup # show the wizard at startup
Show on empty graph click # show the wizard when an empty graph is clicked

I usually tick only the second one, so the wizard does not pop up at startup. As for which machine to run, I currently pick Footprint L1. The left-hand palette is extremely powerful, the right-hand panel lets you change various project properties (a bit like the VB 6.0 IDE), and the bottom pane is the run log, where you can choose which fields are displayed.
 

Kali DNS Analysis: Using urlcrazy

urlcrazy checks whether similar domain names exist, e.g. 0535code.com, 0535coder.com, and so on; the goal is to find look-alike domains that might be registered for phishing or fraud.

URLCrazy version 0.5
by Andrew Horton (urbanadventurer)

URLCrazy

Generate and test domain typos and variations to detect and perform typo squatting, URL hijacking,
phishing, and corporate espionage.

Supports the following domain variations:
Character omission, character repeat, adjacent character swap, adjacent character replacement, double
character replacement, adjacent character insertion, missing dot, strip dashes, singular or pluralise,
common misspellings, vowel swaps, homophones, bit flipping (cosmic rays), homoglyphs, wrong top level
domain, and wrong second level domain.

Usage: /usr/bin/urlcrazy [options] domain

Options
-k, --keyboard=LAYOUT   Options are: qwerty, azerty, qwertz, dvorak (default: qwerty)
-p, --popularity        Check domain popularity with Google # check popularity via Google
-r, --no-resolve        Do not resolve DNS # skip DNS resolution
-i, --show-invalid      Show invalid domain names # include invalid domain names
-f, --format=TYPE       Human readable or CSV (default: human readable) # output format
-o, --output=FILE       Output file # write the report to a file
-h, --help              This help
-v, --version           Print version information. This version is 0.5

Usage examples:

urlcrazy 0535code.com    or    urlcrazy -p -r 0535code.com

root@kali:~# urlcrazy -p -r 0535code.com
/usr/share/urlcrazy/tld.rb:81: warning: duplicated key at line 81 ignored: "2nd_level_registration"
/usr/share/urlcrazy/tld.rb:89: warning: duplicated key at line 89 ignored: "2nd_level_registration"
/usr/share/urlcrazy/tld.rb:91: warning: duplicated key at line 91 ignored: "2nd_level_registration"
URLCrazy Domain Report
Domain    : 0535code.com
Keyboard  : qwerty
At        : 2016-11-01 17:51:34 +0800

# Please wait. 110 hostnames to process

Typo Type              Typo               Pop  CC-A  Extn
----------------------------------------------------------
Character Omission     035code.com                    com
Character Omission     0535cde.com                    com
Character Omission     0535cod.com                    com
Character Omission     0535code.cm                    cm
Character Omission     0535coe.com                    com
Character Omission     0535ode.com                    com
Character Omission     053code.com                    com
Character Omission     055code.com                    com
Character Repeat       00535code.com                  com
Character Repeat       05335code.com                  com
Character Repeat       05355code.com                  com
Character Repeat       0535ccode.com                  com
Character Repeat       0535codde.com                  com
Character Repeat       0535codee.com                  com
Character Repeat       0535coode.com                  com
Character Repeat       05535code.com                  com
Character Swap         0355code.com                   com
Character Swap         0535cdoe.com                   com
Character Swap         0535coed.com                   com
Character Swap         0535ocde.com                   com
Character Swap         053c5ode.com                   com
Character Swap         0553code.com                   com
Character Swap         5035code.com                   com
Character Replacement  -535code.com                   com
Character Replacement  0435code.com                   com
Character Replacement  0525code.com                   com
Character Replacement  0534code.com                   com
Character Replacement  0535cide.com                   com
Character Replacement  0535codr.com                   com
Character Replacement  0535codw.com                   com
Character Replacement  0535cofe.com                   com
Character Replacement  0535cose.com                   com
Character Replacement  0535cpde.com                   com
Character Replacement  0535vode.com                   com
Character Replacement  0535xode.com                   com
Character Replacement  0536code.com                   com
Character Replacement  0545code.com                   com
Character Replacement  0635code.com                   com
Character Replacement  9535code.com                   com
Character Insertion    0-535code.com                  com
Character Insertion    05325code.com                  com
Character Insertion    05345code.com                  com
Character Insertion    05354code.com                  com
Character Insertion    05356code.com                  com
Character Insertion    0535coder.com                  com
Character Insertion    0535codew.com                  com
Character Insertion    0535codfe.com                  com
Character Insertion    0535codse.com                  com
Character Insertion    0535coide.com                  com
Character Insertion    0535copde.com                  com
Character Insertion    0535cvode.com                  com
Character Insertion    0535cxode.com                  com
Character Insertion    05435code.com                  com
Character Insertion    05635code.com                  com
Character Insertion    09535code.com                  com
Missing Dot            0535codecom.com                com
Missing Dot            www0535code.com                com
Singular or Pluralise  0535codes.com                  com
Vowel Swap             0535coda.com                   com
Vowel Swap             0535codi.com                   com
Vowel Swap             0535codo.com                   com
Vowel Swap             0535codu.com                   com
Homophones             0535cowed.com                  com
Bit Flipping           0135code.com                   com
Bit Flipping           0515code.com                   com
Bit Flipping           0531code.com                   com
Bit Flipping           0535aode.com                   com
Bit Flipping           0535bode.com                   com
Bit Flipping           0535cgde.com                   com
Bit Flipping           0535ckde.com                   com
Bit Flipping           0535cmde.com                   com
Bit Flipping           0535cnde.com                   com
Bit Flipping           0535codd.com                   com
Bit Flipping           0535codg.com                   com
Bit Flipping           0535codm.com                   com
Bit Flipping           0535coee.com                   com
Bit Flipping           0535cole.com                   com
Bit Flipping           0535cote.com                   com
Bit Flipping           0535gode.com                   com
Bit Flipping           0535kode.com                   com
Bit Flipping           0535sode.com                   com
Bit Flipping           0537code.com                   com
Bit Flipping           053ucode.com                   com
Bit Flipping           0575code.com                   com
Bit Flipping           05s5code.com                   com
Bit Flipping           0735code.com                   com
Bit Flipping           0u35code.com                   com
Bit Flipping           1535code.com                   com
Bit Flipping           2535code.com                   com
Bit Flipping           4535code.com                   com
Bit Flipping           8535code.com                   com
Bit Flipping           p535code.com                   com
Homoglyphs             0535c0de.com                   com
Homoglyphs             0535cocle.com                  com
Homoglyphs             o535code.com                   com
Wrong TLD              0535code.ca                    ca
Wrong TLD              0535code.ch                    ch
Wrong TLD              0535code.de                    de
Wrong TLD              0535code.edu                   edu
Wrong TLD              0535code.es                    es
Wrong TLD              0535code.fr                    fr
Wrong TLD              0535code.it                    it
Wrong TLD              0535code.jp                    jp
Wrong TLD              0535code.net                   net
Wrong TLD              0535code.nl                    nl
Wrong TLD              0535code.no                    no
Wrong TLD              0535code.org                   org
Wrong TLD              0535code.ru                    ru
Wrong TLD              0535code.se                    se
Wrong TLD              0535code.us                    us

It also looks like a handy tool to run before buying a domain, to see what look-alike names are out there.
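
For later review (for example before registering a domain), the -f and -o options shown above can write the report to CSV; a minimal sketch, with a hypothetical output path:

urlcrazy -f csv -o /root/0535code-typos.csv 0535code.com # CSV report of every typo variant (output path is just an example)
urlcrazy -k azerty -i 0535code.com # re-run against another keyboard layout and also list invalid names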

Kali DNS Analysis: Using fierce

fierce.pl (C) Copywrite 2006,2007 - By RSnake at http://ha.ckers.org/fierce/

Usage: perl fierce.pl [-dns example.com] [OPTIONS]

Overview:
Fierce is a semi-lightweight scanner that helps locate non-contiguous
IP space and hostnames against specified domains. It's really meant
as a pre-cursor to nmap, unicornscan, nessus, nikto, etc, since all
of those require that you already know what IP space you are looking
for. This does not perform exploitation and does not scan the whole
internet indiscriminately. It is meant specifically to locate likely
targets both inside and outside a corporate network. Because it uses
DNS primarily you will often find mis-configured networks that leak
internal address space. That's especially useful in targeted malware.

Options:
-connect      Attempt to make http connections to any non RFC1918
(public) addresses. This will output the return headers but
be warned, this could take a long time against a company with
many targets, depending on network/machine lag. I wouldn't
recommend doing this unless it's a small company or you have a
lot of free time on your hands (could take hours-days).
Inside the file specified the text "Host:\n" will be replaced
by the host specified. Usage:

perl fierce.pl -dns example.com -connect headers.txt

-delay        The number of seconds to wait between lookups. # seconds to wait between lookups
-dns          The domain you would like scanned. # the domain to scan
-dnsfile      Use DNS servers provided by a file (one per line) for
reverse lookups (brute force). # use DNS servers listed in a file for the brute force
-dnsserver    Use a particular DNS server for reverse lookups
(probably should be the DNS server of the target). Fierce
uses your DNS server for the initial SOA query and then uses
the target's DNS server for all additional queries by default. # DNS server to use for reverse lookups
-file         A file you would like to output to be logged to. # output/log file
-fulloutput   When combined with -connect this will output everything
the webserver sends back, not just the HTTP headers. # output everything, not just the headers
-help         This screen.
-nopattern    Don't use a search pattern when looking for nearby
hosts. Instead dump everything. This is really noisy but
is useful for finding other domains that spammers might be
using. It will also give you lots of false positives,
especially on large domains. # dump every nearby host; very noisy and time-consuming
-range        Scan an internal IP range (must be combined with
-dnsserver). Note, that this does not support a pattern # IP range to scan
and will simply output anything it finds. Usage:

perl fierce.pl -range 111.222.333.0-255 -dnsserver ns1.example.co

-search       Search list. When fierce attempts to traverse up and
down ipspace it may encounter other servers within other
domains that may belong to the same company. If you supply a
comma delimited list to fierce it will report anything found. # search list
This is especially useful if the corporate servers are named
different from the public facing website. Usage:

perl fierce.pl -dns examplecompany.com -search corpcompany,blahcompany

Note that using search could also greatly expand the number of
hosts found, as it will continue to traverse once it locates
servers that you specified in your search list. The more the
better.
-suppress     Suppress all TTY output (when combined with -file).
-tcptimeout   Specify a different timeout (default 10 seconds). You
may want to increase this if the DNS server you are querying
is slow or has a lot of network lag. # timeout in seconds
-threads      Specify how many threads to use while scanning (default
is single threaded). # number of threads
-traverse     Specify a number of IPs above and below whatever IP you
have found to look for nearby IPs. Default is 5 above and
below. Traverse will not move into other C blocks. # nearby IPs to check above and below each hit
-version      Output the version number.
-wide         Scan the entire class C after finding any matching
hostnames in that class C. This generates a lot more traffic
but can uncover a lot more information. # scan the whole class C
-wordlist     Use a seperate wordlist (one word per line). Usage: # wordlist file

perl fierce.pl -dns examplecompany.com -wordlist dictionary.txt

Usage examples:
Simple scan: fierce -dns 0535code.com
With a wordlist: fierce -dns 0535code.com -wordlist /usr/share/fierce/hosts.txt
When the target sits behind a DNS-based firewall/CDN, fierce may still uncover the real IP. It is time-consuming, but the tool works well. I recommend running both the simple scan and the wordlist scan: against one domain with a wildcard record, the wordlist scan found nothing while the simple scan did reveal the real IP.
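
To log the results and speed the brute force up a little, the options above can be combined; a sketch using the Kali wordlist path from the example and a hypothetical output file name:

fierce -dns 0535code.com -wordlist /usr/share/fierce/hosts.txt -threads 3 -file fierce-0535code.txt # wordlist brute force with 3 threads, logged to fierce-0535code.txt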

Kali DNS Analysis: Using dnswalk

/usr/bin/dnswalk version [unknown] calling Getopt::Std::getopts (version 1.11 [paranoid]),
running under Perl version 5.22.1.

Usage: dnswalk [-OPTIONS [-MORE_OPTIONS]] [--] [PROGRAM_ARG1 ...]

The following single-character options are accepted:
With arguments: -D # the only option that takes an argument
Boolean (without arguments): -r -f -i -a -d -m -F -l

Options may be merged together. -- stops processing of options.
Space is not required between options and their arguments.
[Now continuing due to backward compatibility and excessive paranoia.
See 'perldoc Getopt::Std' about $Getopt::Std::STANDARD_HELP_VERSION.]
Usage: dnswalk domain
domain MUST end with a '.'

Usage example:

dnswalk 0535code.com. # the trailing dot after the domain is required
This tool is quite automated and smart, so it is my pick for this kind of DNS information gathering. Behind a firewall/CDN you will only get the DNS results, but if the zone is vulnerable to a DNS zone transfer it will definitely show up. Reference: http://0535code.com/article/20160624_906.shtml
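
The boolean switches listed in the help are not explained there; going by the dnswalk man page (treat this as an assumption rather than something verified here), -r recurses into delegated sub-domains and -d prints debugging output, so a more thorough run might look like:

dnswalk -r -d 0535code.com. # assumed flags: -r recurse into sub-domains, -d debug output; the trailing dot is still required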

Kali DNS Analysis: Using dnstracer

DNSTRACER version 1.8.1 - (c) Edwin Groothuis - http://www.mavetju.org
Usage: dnstracer [options] [host]
-c: disable local caching, default enabled # disable local caching (enabled by default)
-C: enable negative caching, default disabled # enable negative caching (disabled by default)
-o: enable overview of received answers, default disabled # print an overview of the received answers (disabled by default)
-q <querytype>: query-type to use for the DNS requests, default A # DNS record type to query, default is the A record
-r <retries>: amount of retries for DNS requests, default 3 # number of retries per DNS request, default 3
-s <server>: use this server for the initial request, default localhost # server for the initial request, default localhost
If . is specified, A.ROOT-SERVERS.NET will be used. # a custom starting server can be given
-t <maximum timeout>: Limit time to wait per try # timeout per try
-v: verbose # verbose output
-S <ip address>: use this source address. # source IP address to use
-4: don't query IPv6 servers # do not query IPv6 servers

Usage example:
root@kali:~# dnstracer -o 0535code.com
Tracing to 0535code.com[a] via 192.168.119.2, maximum of 3 retries
192.168.119.2 (192.168.119.2)
|\___ ns1.jiasule.net [0535code.com] (113.207.76.65) Got authoritative answer
|\___ ns1.jiasule.net [0535code.com] (180.97.158.110) Got authoritative answer
|\___ ns1.jiasule.net [0535code.com] (119.188.35.26) Got authoritative answer
|\___ ns2.jiasule.net [0535code.com] (113.207.76.78) Got authoritative answer
 \___ ns2.jiasule.net [0535code.com] (180.97.158.110) (cached)

ns2.jiasule.net (113.207.76.78)    0535code.com -> 42.48.109.206
ns1.jiasule.net (119.188.35.26)    0535code.com -> 42.48.109.206
ns1.jiasule.net (180.97.158.110)   0535code.com -> 218.75.177.200
ns1.jiasule.net (180.97.158.110)   0535code.com -> 218.75.177.229
ns1.jiasule.net (113.207.76.65)    0535code.com -> 42.48.109.206
### This traces the DNS delegation and resolution path for the domain.
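
The same trace can be run for other record types, or started from the root servers instead of the local resolver, using the -q and -s options documented above; for example:

dnstracer -o -q mx 0535code.com # trace the MX record instead of A, with an overview at the end
dnstracer -o -s . 0535code.com # start from A.ROOT-SERVERS.NET rather than the local DNS server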
 

Kali DNS Analysis: Using dnsrecon

Version: 0.8.10
Usage: dnsrecon.py <options>

Options:
-h, --help                   Show this help message and exit.
-d, --domain      <domain>   Target domain. # target domain
-r, --range       <range>    IP range for reverse lookup brute force in formats (first-last) or in (range/bitmask). # IP range for reverse lookups
-n, --name_server <name>     Domain server to use. If none is given, the SOA of the target will be used. # DNS server to use
-D, --dictionary  <file>     Dictionary file of subdomain and hostnames to use for brute force. # dictionary file
-f                           Filter out of brute force domain lookup, records that resolve to the wildcard defined
                             IP address when saving records.
-t, --type        <types>    Type of enumeration to perform: # enumeration type
    std       SOA, NS, A, AAAA, MX and SRV if AXRF on the NS servers fail.
    rvl       Reverse lookup of a given CIDR or IP range.
    brt       Brute force domains and hosts using a given dictionary.
    srv       SRV records.
    axfr      Test all NS servers for a zone transfer.
    goo       Perform Google search for subdomains and hosts.
    snoop     Perform cache snooping against all NS servers for a given domain, testing
              all with file containing the domains, file given with -D option.
    tld       Remove the TLD of given domain and test against all TLDs registered in IANA.
    zonewalk  Perform a DNSSEC zone walk using NSEC records.
-a                           Perform AXFR with standard enumeration.
-s                           Perform a reverse lookup of IPv4 ranges in the SPF record with standard enumeration. # reverse lookup of SPF ranges
-g                           Perform Google enumeration with standard enumeration. # enumerate via Google
-w                           Perform deep whois record analysis and reverse lookup of IP ranges found through
                             Whois when doing a standard enumeration. # deep whois analysis plus reverse lookup of discovered ranges
-z                           Performs a DNSSEC zone walk with standard enumeration.
--threads         <number>   Number of threads to use in reverse lookups, forward lookups, brute force and SRV
                             record enumeration. # number of threads
--lifetime        <number>   Time to wait for a server to response to a query. # time to wait for a server to respond
--db              <file>     SQLite 3 file to save found records. # save found records to an SQLite 3 file
--xml             <file>     XML file to save found records. # save found records to an XML file
--iw                         Continue brute forcing a domain even if a wildcard records are discovered.
-c, --csv         <file>     Comma separated value file.
-j, --json        <file>     JSON file.
-v                           Show attempts in the brute force modes.

 
Usage example:
dnsrecon -d 0535code.com -g # enumerate domains and IPs via Google
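
A few more runs built from the enumeration types listed above; the dictionary path is where dnsrecon's bundled namelist usually sits on Kali, which I have not re-verified, so treat it as an assumption:

dnsrecon -d 0535code.com -t axfr # test every NS server for a zone transfer
dnsrecon -d 0535code.com -t brt -D /usr/share/dnsrecon/namelist.txt --xml 0535code-dns.xml # dictionary brute force, results saved as XML (assumed wordlist path, example output name)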

Kali DNS Analysis: Using dnsenum

dnsenum.pl VERSION:1.2.3
Usage: dnsenum.pl [Options] <domain>
[Options]:
Note: the brute force -f switch is obligatory.
GENERAL OPTIONS:
--dnsserver <server>
    Use this DNS server for A, NS and MX queries. # DNS server to use for A, NS and MX queries
--enum                  Shortcut option equivalent to --threads 5 -s 15 -w. # shortcut enabling threads, Google scraping and whois
-h, --help              Print this help message.
--noreverse             Skip the reverse lookup operations. # skip reverse lookups
--nocolor               Disable ANSIColor output. # no colored console output
--private               Show and save private ips at the end of the file domain_ips.txt. # show and save private IPs in domain_ips.txt
--subfile <file>        Write all valid subdomains to this file. # write all valid subdomains to a file
-t, --timeout <value>   The tcp and udp timeout values in seconds (default: 10s). # TCP/UDP timeout
--threads <value>       The number of threads that will perform different queries. # number of threads
-v, --verbose           Be verbose: show all the progress and all the error messages. # show all progress and error messages
GOOGLE SCRAPING OPTIONS:
-p, --pages <value>     The number of google search pages to process when scraping names,
    the default is 5 pages, the -s switch must be specified. # Google pages to process, default 5
-s, --scrap <value>     The maximum number of subdomains that will be scraped from Google (default 15). # maximum subdomains scraped from Google, default 15
BRUTE FORCE OPTIONS:
-f, --file <file>       Read subdomains from this file to perform brute force. # read subdomains from a file
-u, --update <a|g|r|z>
    Update the file specified with the -f switch with valid subdomains.
    a (all)     Update using all results.
    g           Update using only google scraping results.
    r           Update using only reverse lookup results.
    z           Update using only zonetransfer results.
-r, --recursion         Recursion on subdomains, brute force all discovred subdomains that have an NS record. # recurse: brute force every discovered subdomain that has an NS record
WHOIS NETRANGE OPTIONS:
-d, --delay <value>     The maximum value of seconds to wait between whois queries, the value is defined randomly, default: 3s. # maximum (random) delay between whois queries, default 3s
-w, --whois             Perform the whois queries on c class network ranges. # whois lookups on class C network ranges
    **Warning**: this can generate very large netranges and it will take lot of time to performe reverse lookups.
REVERSE LOOKUP OPTIONS:
-e, --exclude <regexp>
    Exclude PTR records that match the regexp expression from reverse lookup results, useful on invalid hostnames. # exclude PTR records matching the regexp; useful for junk hostnames
OUTPUT OPTIONS:
-o, --output <file>     Output in XML format. Can be imported in MagicTree (www.gremwell.com) # XML output file

Usage example:
dnsenum 0535code.com
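
A fuller run combining the options above; the dns.txt wordlist path is where Kali normally ships it, which I have not re-checked here, so treat it as an assumption:

dnsenum --enum -f /usr/share/dnsenum/dns.txt --subfile 0535code-subs.txt 0535code.com # --enum = 5 threads + Google scraping + whois netranges; brute force from dns.txt; valid subdomains saved to 0535code-subs.txt (example file names)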

Kali Web Applications: Using Paros

Paros is another proxy tool that performs man-in-the-middle interception. I originally wanted to look at the packets an iPhone sends when texting, but figured they are probably not HTTP; this tool apparently only captures the HTTP protocol and does not support raw sockets, so for that only Wireshark will do. I tried Paros and, sure enough, it did not work for that case. It resembles several of Burp Suite's modules, and since Kali 2.0 added it, it presumably has its strengths somewhere. To use it, open Options -> Local proxy, set the proxy IP and port, and it then works normally, as shown below:
kali-parosy
Capturing the iPhone's traffic with Paros, one extra IP showed up; iOS was updating applications at the time, so it was probably one of those apps. Sending an iMessage reported a send failure; I will capture with Wireshark later and dig into it.

Kali Web Applications: Using HTTrack

I have been using HTTrack on Windows for a long time. Of the several crawlers I have tried, HTTrack is the best: it mirrors sites very completely. It is not meant for custom scraping development, but for making a local copy of a website's pages it is magic. On Windows it is trivial: enter the URL and click Next, Next. As shown below:
windows-httrack
Under Kali, httrack is driven by a large set of command-line parameters, documented as follows:

HTTrack version 3.48-24
usage: httrack <URLs> [-option] [+<URL_FILTER>] [-<URL_FILTER>] [+<mime:MIME_FILTER>] [-<mime:MIME_FILTER>]
with options listed below: (* is the default value)

General options:
O  path for mirror/logfiles+cache (-O path_mirror[,path_cache_and_logfiles]) (--path <param>)

Action options:
w *mirror web sites (--mirror)
W  mirror web sites, semi-automatic (asks questions) (--mirror-wizard)
g  just get files (saved in the current directory) (--get-files)
i  continue an interrupted mirror using the cache (--continue)
Y  mirror ALL links located in the first level pages (mirror links) (--mirrorlinks)

Proxy options:
P  proxy use (-P proxy:port or -P user:pass@proxy:port) (--proxy <param>)
%f *use proxy for ftp (f0 don't use) (--httpproxy-ftp[=N])
%b  use this local hostname to make/send requests (-%b hostname) (--bind <param>)

Limits options:
rN set the mirror depth to N (* r9999) (--depth[=N])
%eN set the external links depth to N (* %e0) (--ext-depth[=N])
mN maximum file length for a non-html file (--max-files[=N])
mN,N2 maximum file length for non html (N) and html (N2)
MN maximum overall size that can be uploaded/scanned (--max-size[=N])
EN maximum mirror time in seconds (60=1 minute, 3600=1 hour) (--max-time[=N])
AN maximum transfer rate in bytes/seconds (1000=1KB/s max) (--max-rate[=N])
%cN maximum number of connections/seconds (*%c10) (--connection-per-second[=N])
GN pause transfer if N bytes reached, and wait until lock file is deleted (--max-pause[=N])

Flow control:
cN number of multiple connections (*c8) (--sockets[=N])
TN timeout, number of seconds after a non-responding link is shutdown (--timeout[=N])
RN number of retries, in case of timeout or non-fatal errors (*R1) (--retries[=N])
JN traffic jam control, minimum transfert rate (bytes/seconds) tolerated for a link (--min-rate[=N])
HN host is abandonned if: 0=never, 1=timeout, 2=slow, 3=timeout or slow (--host-control[=N])

Links options:
%P *extended parsing, attempt to parse all links, even in unknown tags or Javascript (%P0 don't use) (--extended-parsing[=N])
n  get non-html files 'near' an html file (ex: an image located outside) (--near)
t  test all URLs (even forbidden ones) (--test)
%L <file> add all URL located in this text file (one URL per line) (--list <param>)
%S <file> add all scan rules located in this text file (one scan rule per line) (--urllist <param>)

Build options:
NN structure type (0 *original structure, 1+: see below) (--structure[=N])
or user defined structure (-N "%h%p/%n%q.%t")
%N  delayed type check, don't make any link test but wait for files download to start instead (experimental) (%N0 don't use, %N1 use for unknown extensions, * %N2 always use)
%D  cached delayed type check, don't wait for remote type during updates, to speedup them (%D0 wait, * %D1 don't wait) (--cached-delayed-type-check)
%M  generate a RFC MIME-encapsulated full-archive (.mht) (--mime-html)
LN long names (L1 *long names / L0 8-3 conversion / L2 ISO9660 compatible) (--long-names[=N])
KN keep original links (e.g. http://www.adr/link) (K0 *relative link, K absolute links, K4 original links, K3 absolute URI links, K5 transparent proxy link) (--keep-links[=N])
x  replace external html links by error pages (--replace-external)
%x  do not include any password for external password protected websites (%x0 include) (--disable-passwords)
%q *include query string for local files (useless, for information purpose only) (%q0 don't include) (--include-query-string)
o *generate output html file in case of error (404..) (o0 don't generate) (--generate-errors)
X *purge old files after update (X0 keep delete) (--purge-old[=N])
%p  preserve html files 'as is' (identical to '-K4 -%F ""') (--preserve)
%T  links conversion to UTF-8 (--utf8-conversion)

Spider options:
bN accept cookies in cookies.txt (0=do not accept,* 1=accept) (--cookies[=N])
u  check document type if unknown (cgi,asp..) (u0 don't check, * u1 check but /, u2 check always) (--check-type[=N])
j *parse Java Classes (j0 don't parse, bitmask: |1 parse default, |2 don't parse .class |4 don't parse .js |8 don't be aggressive) (--parse-java[=N])
sN follow robots.txt and meta robots tags (0=never,1=sometimes,* 2=always, 3=always (even strict rules)) (--robots[=N])
%h  force HTTP/1.0 requests (reduce update features, only for old servers or proxies) (--http-10)
%k  use keep-alive if possible, greately reducing latency for small files and test requests (%k0 don't use) (--keep-alive)
%B  tolerant requests (accept bogus responses on some servers, but not standard!) (--tolerant)
%s  update hacks: various hacks to limit re-transfers when updating (identical size, bogus response..) (--updatehack)
%u  url hacks: various hacks to limit duplicate URLs (strip //, www.foo.com==foo.com..) (--urlhack)
%A  assume that a type (cgi,asp..) is always linked with a mime type (-%A php3,cgi=text/html;dat,bin=application/x-zip) (--assume <param>)
shortcut: '--assume standard' is equivalent to -%A php2 php3 php4 php cgi asp jsp pl cfm nsf=text/html
can also be used to force a specific file type: --assume foo.cgi=text/html
@iN internet protocol (0=both ipv6+ipv4, 4=ipv4 only, 6=ipv6 only) (--protocol[=N])
%w  disable a specific external mime module (-%w htsswf -%w htsjava) (--disable-module <param>)

Browser ID:
F  user-agent field sent in HTTP headers (-F "user-agent name") (--user-agent <param>)
%R  default referer field sent in HTTP headers (--referer <param>)
%E  from email address sent in HTTP headers (--from <param>)
%F  footer string in Html code (-%F "Mirrored [from host %s [file %s [at %s]]]" (--footer <param>)
%l  preffered language (-%l "fr, en, jp, *" (--language <param>)
%a  accepted formats (-%a "text/html,image/png;q=0.9,*/*;q=0.1" (--accept <param>)
%X  additional HTTP header line (-%X "X-Magic: 42" (--headers <param>)

Log, index, cache
C  create/use a cache for updates and retries (C0 no cache,C1 cache is prioritary,* C2 test update before) (--cache[=N])
k  store all files in cache (not useful if files on disk) (--store-all-in-cache)
%n  do not re-download locally erased files (--do-not-recatch)
%v  display on screen filenames downloaded (in realtime) - * %v1 short version - %v2 full animation (--display)
Q  no log - quiet mode (--do-not-log)
q  no questions - quiet mode (--quiet)
z  log - extra infos (--extra-log)
Z  log - debug (--debug-log)
v  log on screen (--verbose)
f *log in files (--file-log)
f2 one single log file (--single-log)
I *make an index (I0 don't make) (--index)
%i  make a top index for a project folder (* %i0 don't make) (--build-top-index)
%I  make an searchable index for this mirror (* %I0 don't make) (--search-index)

Expert options:
pN priority mode: (* p3) (--priority[=N])
p0 just scan, don't save anything (for checking links)
p1 save only html files
p2 save only non html files
*p3 save all files
p7 get html files before, then treat other files
S  stay on the same directory (--stay-on-same-dir)
D *can only go down into subdirs (--can-go-down)
U  can only go to upper directories (--can-go-up)
B  can both go up&down into the directory structure (--can-go-up-and-down)
a *stay on the same address (--stay-on-same-address)
d  stay on the same principal domain (--stay-on-same-domain)
l  stay on the same TLD (eg: .com) (--stay-on-same-tld)
e  go everywhere on the web (--go-everywhere)
%H  debug HTTP headers in logfile (--debug-headers)

Guru options: (do NOT use if possible)
#X *use optimized engine (limited memory boundary checks) (--fast-engine)
#0  filter test (-#0 '*.gif' 'www.bar.com/foo.gif') (--debug-testfilters <param>)
#1  simplify test (-#1 ./foo/bar/../foobar)
#2  type test (-#2 /foo/bar.php)
#C  cache list (-#C '*.com/spider*.gif' (--debug-cache <param>)
#R  cache repair (damaged cache) (--repair-cache)
#d  debug parser (--debug-parsing)
#E  extract new.zip cache meta-data in meta.zip
#f  always flush log files (--advanced-flushlogs)
#FN maximum number of filters (--advanced-maxfilters[=N])
#h  version info (--version)
#K  scan stdin (debug) (--debug-scanstdin)
#L  maximum number of links (-#L1000000) (--advanced-maxlinks[=N])
#p  display ugly progress information (--advanced-progressinfo)
#P  catch URL (--catch-url)
#R  old FTP routines (debug) (--repair-cache)
#T  generate transfer ops. log every minutes (--debug-xfrstats)
#u  wait time (--advanced-wait)
#Z  generate transfer rate statictics every minutes (--debug-ratestats)

Dangerous options: (do NOT use unless you exactly know what you are doing)
%!  bypass built-in security limits aimed to avoid bandwidth abuses (bandwidth, simultaneous connections) (--disable-security-limits)
IMPORTANT NOTE: DANGEROUS OPTION, ONLY SUITABLE FOR EXPERTS
USE IT WITH EXTREME CARE

Command-line specific options:
V execute system command after each files ($0 is the filename: -V "rm \$0") (--userdef-cmd <param>)
%W use an external library function as a wrapper (-%W myfoo.so[,myparameters]) (--callback <param>)

Details: Option N
N0 Site-structure (default)
N1 HTML in web/, images/other files in web/images/
N2 HTML in web/HTML, images/other in web/images
N3 HTML in web/, images/other in web/
N4 HTML in web/, images/other in web/xxx, where xxx is the file extension (all gif will be placed onto web/gif, for example)
N5 Images/other in web/xxx and HTML in web/HTML
N99 All files in web/, with random names (gadget !)
N100 Site-structure, without www.domain.xxx/
N101 Identical to N1 exept that "web" is replaced by the site's name
N102 Identical to N2 exept that "web" is replaced by the site's name
N103 Identical to N3 exept that "web" is replaced by the site's name
N104 Identical to N4 exept that "web" is replaced by the site's name
N105 Identical to N5 exept that "web" is replaced by the site's name
N199 Identical to N99 exept that "web" is replaced by the site's name
N1001 Identical to N1 exept that there is no "web" directory
N1002 Identical to N2 exept that there is no "web" directory
N1003 Identical to N3 exept that there is no "web" directory (option set for g option)
N1004 Identical to N4 exept that there is no "web" directory
N1005 Identical to N5 exept that there is no "web" directory
N1099 Identical to N99 exept that there is no "web" directory
Details: User-defined option N
'%n' Name of file without file type (ex: image)
'%N' Name of file, including file type (ex: image.gif)
'%t' File type (ex: gif)
'%p' Path [without ending /] (ex: /someimages)
'%h' Host name (ex: www.someweb.com)
'%M' URL MD5 (128 bits, 32 ascii bytes)
'%Q' query string MD5 (128 bits, 32 ascii bytes)
'%k' full query string
'%r' protocol name (ex: http)
'%q' small query string MD5 (16 bits, 4 ascii bytes)
'%s?' Short name version (ex: %sN)
'%[param]' param variable in query string
'%[param:before:after:empty:notfound]' advanced variable extraction
Details: User-defined option N and advanced variable extraction
%[param:before:after:empty:notfound]
param : parameter name
before : string to prepend if the parameter was found
after : string to append if the parameter was found
notfound : string replacement if the parameter could not be found
empty : string replacement if the parameter was empty
all fields, except the first one (the parameter name), can be empty

Details: Option K
K0  foo.cgi?q=45  ->  foo4B54.html?q=45 (relative URI, default)
K                 ->  http://www.foobar.com/folder/foo.cgi?q=45 (absolute URL) (--keep-links[=N])
K3                ->  /folder/foo.cgi?q=45 (absolute URI)
K4                ->  foo.cgi?q=45 (original URL)
K5                ->  http://www.foobar.com/folder/foo4B54.html?q=45 (transparent proxy URL)

Shortcuts:
--mirror      <URLs> *make a mirror of site(s) (default)
--get         <URLs>  get the files indicated, do not seek other URLs (-qg)
--list   <text file>  add all URL located in this text file (-%L)
--mirrorlinks <URLs>  mirror all links in 1st level pages (-Y)
--testlinks   <URLs>  test links in pages (-r1p0C0I0t)
--spider      <URLs>  spider site(s), to test links: reports Errors & Warnings (-p0C0I0t)
--testsite    <URLs>  identical to --spider
--skeleton    <URLs>  make a mirror, but gets only html files (-p1)
--update              update a mirror, without confirmation (-iC2)
--continue            continue a mirror, without confirmation (-iC1)

--catchurl            create a temporary proxy to capture an URL or a form post URL
--clean               erase cache & log files

--http10              force http/1.0 requests (-%h)

Details: Option %W: External callbacks prototypes
see htsdefines.h

example: httrack www.someweb.com/bob/
means:   mirror site www.someweb.com/bob/ and only this site

example: httrack www.someweb.com/bob/ www.anothertest.com/mike/ +*.com/*.jpg -mime:application/*
means:   mirror the two sites together (with shared links) and accept any .jpg files on .com sites

example: httrack www.someweb.com/bob/bobby.html +* -r6
means get all files starting from bobby.html, with 6 link-depth, and possibility of going everywhere on the web

example: httrack www.someweb.com/bob/bobby.html --spider -P proxy.myhost.com:8080
runs the spider on www.someweb.com/bob/bobby.html using a proxy

example: httrack --update
updates a mirror in the current folder

example: httrack
will bring you to the interactive mode

example: httrack --continue
continues a mirror in the current folder

HTTrack version 3.48-24
Copyright (C) 1998-2016 Xavier Roche and other contributors

There are a great many options; the examples above show how they are used. A quick test run:

root@0535code:~# httrack http://0535code.com/index.php
WARNING! You are running this program as root!
It might be a good idea to run as a different user
Mirror launched on Tue, 18 Oct 2016 22:26:24 by HTTrack Website Copier/3.48-24 [XR&CO'2014]
mirroring http://0535code.com/index.php with the wizard help..
^C0535code.com/tag/js (23999 bytes) - OK

A 0535code.com folder is created automatically in the current working directory. Once the crawl finishes, just open the index file inside 0535code.com.
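
For a more controlled mirror, the options documented above can be combined; this is a sketch only, with an illustrative output directory and scan-rule filter:

httrack http://0535code.com/ -O /root/mirror-0535code "+0535code.com/*" -r5 -c8 -s0 -F "Mozilla/5.0" # depth 5, 8 sockets, ignore robots.txt, custom user-agent, keep only URLs under 0535code.com (path and filter are just examples)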
 

Kali Exploit Database: Looking Up Vulnerabilities with searchsploit

Usage: searchsploit [options] term1 [term2] … [termN]
Example: searchsploit oracle windows local

=======
Options
=======

-c                Perform case-sensitive searches; by default, searches will
                  try to be greedy
-h, --help        Show help screen
-v                By setting verbose output, description lines are allowed to
                  overflow their columns

*NOTES*
Use any number of search terms you would like (minimum of one).
Search terms are not case sensitive, and order is irrelevant.

Let's try it:
root@0535coder:~# searchsploit dede
Description                                    Path
---------------------------------------------- ----------------------------------
DedeCMS 5.1 - SQL Injection                   | /php/webapps/9876.txt
Dede CMS All Versions SQL Injection Vulnerab  | /php/webapps/18292.txt
DeDeCMS 5.5 '_SESSION[dede_admin_id]' Parame  | /php/webapps/33685.html

Very powerful: these are all exploits (EXPs), and the attack details are stored in the exploit database.

root@0535coder:~# locate searchsploit
/usr/bin/searchsploit
/usr/share/applications/kali-searchsploit.desktop
/usr/share/exploitdb/searchsploit
/usr/share/kali-menu/applications/kali-searchsploit.desktop
/usr/share/man/man1/searchsploit.1.gz
The exploits themselves live in: /usr/share/exploitdb/platforms/
The exploit index file is: /usr/share/exploitdb/Files.csv
The bundled copy seems rather old, so it is worth updating:
Official Exploit-Database GitHub repository: https://github.com/offensive-security/exploit-database
Just copy searchsploit, files.csv and platforms into the /usr/share/exploitdb/ directory.
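
Once a candidate exploit turns up, the Path column is relative to the exploit directory mentioned above, so reading or copying one is straightforward (the 9876.txt entry comes from the DedeCMS search output earlier):

searchsploit -c DedeCMS # case-sensitive search
cat /usr/share/exploitdb/platforms/php/webapps/9876.txt # read the exploit details
cp /usr/share/exploitdb/platforms/php/webapps/9876.txt /root/ # copy it out to work on (destination is just an example)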
 
One more resource, 360's open-source-software vulnerability database:
http://webscan.360.cn/vul - you can look up vulnerabilities in open-source applications here as well. It gives no vulnerability details, so you still have to dig those up yourself, but at least you learn which security issues, and how many high-risk ones, exist in the open-source software you are using.