[Batch/DOS-CMD Commands: Summary and Notes] - External Commands - cmd download command, capture command (wget)
1. Advantages of the wget download program
1) Supports resuming interrupted downloads
2) Supports both FTP and HTTP downloads
3) Supports proxy servers
4) Simple and convenient to set up
5) Small program, completely free
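As a quick taste of points 1) and 3), here is a minimal sketch (the URL and proxy address are placeholders, not real endpoints):
:: Resume a partially downloaded file with -c (--continue)
wget -c http://example.com/big-file.zip
:: Route downloads through an HTTP proxy by setting the http_proxy variable first (address is a placeholder)
set http_proxy=http://127.0.0.1:8080
wget http://example.com/big-file.zip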
2. Downloading and installing wget
The official wget page is: Wget for Windows
Strangely, though, every download link there returned a 301 error.
So I had to switch to another site; I recommend GNU Wget 1.21.3 for Windows.
I downloaded the 64-bit zip package of version 1.21.3.
After unzipping, move the whole folder into the system32 directory on drive C and configure the environment variable; the installation is then complete.
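What "configure the environment variable" means in practice is adding the unzipped folder to PATH. A minimal sketch for the current cmd session only (the folder path below is an assumption about where you unzipped wget) is:
:: Append the wget folder to PATH for this session (path is an example)
set PATH=%PATH%;C:\Windows\System32\wget-1.21.3-win64
:: Verify that cmd can now find wget.exe
where wget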
3. How to use the wget command
3.1 Printing wget's help information — wget --help
If this command runs normally, the installation succeeded; if it fails with "'wget' is not recognized as an internal or external command", the installation failed.
That is usually a problem with your environment-variable configuration.
C:\Users\Administrator>wget --help
GNU Wget 1.21.3, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version                   display the version of Wget and exit
  -h,  --help                      print this help
  -b,  --background                go to background after startup
  -e,  --execute=COMMAND           execute a `.wgetrc'-style command

Logging and input file:
  -o,  --output-file=FILE          log messages to FILE
  -a,  --append-output=FILE        append messages to FILE
  -d,  --debug                     print lots of debugging information
  -q,  --quiet                     quiet (no output)
  -v,  --verbose                   be verbose (this is the default)
  -nv, --no-verbose                turn off verboseness, without being quiet
       --report-speed=TYPE         output bandwidth as TYPE. TYPE can be bits
  -i,  --input-file=FILE           download URLs found in local or external FILE
       --input-metalink=FILE       download files covered in local Metalink FILE
  -F,  --force-html                treat input file as HTML
  -B,  --base=URL                  resolves HTML input-file links (-i -F)
                                     relative to URL
       --config=FILE               specify config file to use
       --no-config                 do not read any config file
       --rejected-log=FILE         log reasons for URL rejection to FILE

Download:
  -t,  --tries=NUMBER              set number of retries to NUMBER (0 unlimits)
       --retry-connrefused         retry even if connection is refused
       --retry-on-http-error=ERRORS    comma-separated list of HTTP errors to retry
  -O,  --output-document=FILE      write documents to FILE
  -nc, --no-clobber                skip downloads that would download to
                                     existing files (overwriting them)
       --no-netrc                  don't try to obtain credentials from .netrc
  -c,  --continue                  resume getting a partially-downloaded file
       --start-pos=OFFSET          start downloading from zero-based position OFFSET
       --progress=TYPE             select progress gauge type
       --show-progress             display the progress bar in any verbosity mode
  -N,  --timestamping              don't re-retrieve files unless newer than
                                     local
       --no-if-modified-since      don't use conditional if-modified-since get
                                     requests in timestamping mode
       --no-use-server-timestamps  don't set the local file's timestamp by
                                     the one on the server
  -S,  --server-response           print server response
       --spider                    don't download anything
  -T,  --timeout=SECONDS           set all timeout values to SECONDS
       --dns-servers=ADDRESSES     list of DNS servers to query (comma separated)
       --bind-dns-address=ADDRESS  bind DNS resolver to ADDRESS (hostname or IP) on local host
       --dns-timeout=SECS          set the DNS lookup timeout to SECS
       --connect-timeout=SECS      set the connect timeout to SECS
       --read-timeout=SECS         set the read timeout to SECS
  -w,  --wait=SECONDS              wait SECONDS between retrievals
                                     (applies if more then 1 URL is to be retrieved)
       --waitretry=SECONDS         wait 1..SECONDS between retries of a retrieval
                                     (applies if more then 1 URL is to be retrieved)
       --random-wait               wait from 0.5*WAIT...1.5*WAIT secs between retrievals
                                     (applies if more then 1 URL is to be retrieved)
       --no-proxy                  explicitly turn off proxy
  -Q,  --quota=NUMBER              set retrieval quota to NUMBER
       --bind-address=ADDRESS      bind to ADDRESS (hostname or IP) on local host
       --limit-rate=RATE           limit download rate to RATE
       --no-dns-cache              disable caching DNS lookups
       --restrict-file-names=OS    restrict chars in file names to ones OS allows
       --ignore-case               ignore case when matching files/directories
  -4,  --inet4-only                connect only to IPv4 addresses
  -6,  --inet6-only                connect only to IPv6 addresses
       --prefer-family=FAMILY      connect first to addresses of specified family,
                                     one of IPv6, IPv4, or none
       --user=USER                 set both ftp and http user to USER
       --password=PASS             set both ftp and http password to PASS
       --ask-password              prompt for passwords
       --use-askpass=COMMAND       specify credential handler for requesting
                                     username and password. If no COMMAND is
                                     specified the WGET_ASKPASS or the SSH_ASKPASS
                                     environment variable is used.
       --no-iri                    turn off IRI support
       --local-encoding=ENC        use ENC as the local encoding for IRIs
       --remote-encoding=ENC       use ENC as the default remote encoding
       --unlink                    remove file before clobber
       --keep-badhash              keep files with checksum mismatch (append .badhash)
       --metalink-index=NUMBER     Metalink application/metalink4+xml metaurl ordinal NUMBER
       --metalink-over-http        use Metalink metadata from HTTP response headers
       --preferred-location        preferred location for Metalink resources

Directories:
  -nd, --no-directories            don't create directories
  -x,  --force-directories         force creation of directories
  -nH, --no-host-directories       don't create host directories
       --protocol-directories      use protocol name in directories
  -P,  --directory-prefix=PREFIX   save files to PREFIX/..
       --cut-dirs=NUMBER           ignore NUMBER remote directory components

HTTP options:
       --http-user=USER            set http user to USER
       --http-password=PASS        set http password to PASS
       --no-cache                  disallow server-cached data
       --default-page=NAME         change the default page name (normally
                                     this is 'index.html'.)
  -E,  --adjust-extension          save HTML/CSS documents with proper extensions
       --ignore-length             ignore 'Content-Length' header field
       --header=STRING             insert STRING among the headers
       --compression=TYPE          choose compression, one of auto, gzip and none. (default: none)
       --max-redirect              maximum redirections allowed per page
       --proxy-user=USER           set USER as proxy username
       --proxy-password=PASS       set PASS as proxy password
       --referer=URL               include 'Referer: URL' header in HTTP request
       --save-headers              save the HTTP headers to file
  -U,  --user-agent=AGENT          identify as AGENT instead of Wget/VERSION
       --no-http-keep-alive        disable HTTP keep-alive (persistent connections)
       --no-cookies                don't use cookies
       --load-cookies=FILE         load cookies from FILE before session
       --save-cookies=FILE         save cookies to FILE after session
       --keep-session-cookies      load and save session (non-permanent) cookies
       --post-data=STRING          use the POST method; send STRING as the data
       --post-file=FILE            use the POST method; send contents of FILE
       --method=HTTPMethod         use method "HTTPMethod" in the request
       --body-data=STRING          send STRING as data. --method MUST be set
       --body-file=FILE            send contents of FILE. --method MUST be set
       --content-disposition       honor the Content-Disposition header when
                                     choosing local file names (EXPERIMENTAL)
       --content-on-error          output the received content on server errors
       --auth-no-challenge         send Basic HTTP authentication information
                                     without first waiting for the server's
                                     challenge

HTTPS (SSL/TLS) options:
       --secure-protocol=PR        choose secure protocol, one of auto, SSLv2,
                                     SSLv3, TLSv1, TLSv1_1, TLSv1_2, TLSv1_3 and PFS
       --https-only                only follow secure HTTPS links
       --no-check-certificate      don't validate the server's certificate
       --certificate=FILE          client certificate file
       --certificate-type=TYPE     client certificate type, PEM or DER
       --private-key=FILE          private key file
       --private-key-type=TYPE     private key type, PEM or DER
       --ca-certificate=FILE       file with the bundle of CAs
       --ca-directory=DIR          directory where hash list of CAs is stored
       --crl-file=FILE             file with bundle of CRLs
       --pinnedpubkey=FILE/HASHES  Public key (PEM/DER) file, or any number
                                     of base64 encoded sha256 hashes preceded by
                                     'sha256//' and separated by ';', to verify
                                     peer against
       --random-file=FILE          file with random data for seeding the SSL PRNG
       --ciphers=STR               Set the priority string (GnuTLS) or cipher list string (OpenSSL) directly.
                                     Use with care. This option overrides --secure-protocol.
                                     The format and syntax of this string depend on the specific SSL/TLS engine.

HSTS options:
       --no-hsts                   disable HSTS
       --hsts-file                 path of HSTS database (will override default)

FTP options:
       --ftp-user=USER             set ftp user to USER
       --ftp-password=PASS         set ftp password to PASS
       --no-remove-listing         don't remove '.listing' files
       --no-glob                   turn off FTP file name globbing
       --no-passive-ftp            disable the "passive" transfer mode
       --preserve-permissions      preserve remote file permissions
       --retr-symlinks             when recursing, get linked-to files (not dir)

FTPS options:
       --ftps-implicit                 use implicit FTPS (default port is 990)
       --ftps-resume-ssl               resume the SSL/TLS session started in the control connection when
                                         opening a data connection
       --ftps-clear-data-connection    cipher the control channel only; all the data will be in plaintext
       --ftps-fallback-to-ftp          fall back to FTP if FTPS is not supported in the target server

WARC options:
       --warc-file=FILENAME        save request/response data to a .warc.gz file
       --warc-header=STRING        insert STRING into the warcinfo record
       --warc-max-size=NUMBER      set maximum size of WARC files to NUMBER
       --warc-cdx                  write CDX index files
       --warc-dedup=FILENAME       do not store records listed in this CDX file
       --no-warc-compression       do not compress WARC files with GZIP
       --no-warc-digests           do not calculate SHA1 digests
       --no-warc-keep-log          do not store the log file in a WARC record
       --warc-tempdir=DIRECTORY    location for temporary files created by the
                                     WARC writer

Recursive download:
  -r,  --recursive                 specify recursive download
  -l,  --level=NUMBER              maximum recursion depth (inf or 0 for infinite)
       --delete-after              delete files locally after downloading them
  -k,  --convert-links             make links in downloaded HTML or CSS point to
                                     local files
       --convert-file-only         convert the file part of the URLs only (usually known as the basename)
       --backups=N                 before writing file X, rotate up to N backup files
  -K,  --backup-converted          before converting file X, back up as X.orig
  -m,  --mirror                    shortcut for -N -r -l inf --no-remove-listing
  -p,  --page-requisites           get all images, etc. needed to display HTML page
       --strict-comments           turn on strict (SGML) handling of HTML comments

Recursive accept/reject:
  -A,  --accept=LIST               comma-separated list of accepted extensions
  -R,  --reject=LIST               comma-separated list of rejected extensions
       --accept-regex=REGEX        regex matching accepted URLs
       --reject-regex=REGEX        regex matching rejected URLs
       --regex-type=TYPE           regex type (posix|pcre)
  -D,  --domains=LIST              comma-separated list of accepted domains
       --exclude-domains=LIST      comma-separated list of rejected domains
       --follow-ftp                follow FTP links from HTML documents
       --follow-tags=LIST          comma-separated list of followed HTML tags
       --ignore-tags=LIST          comma-separated list of ignored HTML tags
  -H,  --span-hosts                go to foreign hosts when recursive
  -L,  --relative                  follow relative links only
  -I,  --include-directories=LIST  list of allowed directories
       --trust-server-names        use the name specified by the redirection
                                     URL's last component
  -X,  --exclude-directories=LIST  list of excluded directories
  -np, --no-parent                 don't ascend to the parent directory

Email bug reports, questions, discussions to <bug-wget@gnu.org>
and/or open issues at https://savannah.gnu.org/bugs/?func=additem&group=wget.
3.2 Basic usage of the wget command — wget url
Suppose we want to download the cover image of a Bilibili video; we can run the command 【wget https://i2.hdslb.com/bfs/archive/75c3cff8734a76c3a671d9729eb50dbb7f7dc1c6.jpg@672w_378h_1c.webp】.
The download succeeds.
Opening this webp image in 2345看图王 (an image viewer), it displays correctly.
Note: not every image can be downloaded with wget. For example, the Google logo is at https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_92x30dp.png,
and when we try to download it with wget, the server refuses to respond.
Yet opening the same link directly in a browser works fine.
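One thing worth trying in such cases (a hedged sketch, not verified against Google's servers) is to identify as an ordinary browser with -U, since some servers reject wget's default User-Agent:
:: Pretend to be a regular browser; the UA string is only an illustration
wget -U "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_92x30dp.png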
What we just downloaded was an image; now let's try a video: 【https://v26-web.douyinvod.com/0a9c9db510fa69375b8ca201d682aee5/62b59bc7/video/tos/cn/tos-cn-ve-15c001-alinc2/8d2c907ce7144fa8ad9fe39fff903e1b/?a=6383&ch=5&cr=0&dr=0&lr=all&cd=0%7C0%7C0%7C0&cv=1&br=3130&bt=3130&cs=0&ds=6&ft=5q_lc5mmnPKe2N4EqM9wTqdlYhd.YPjcTR3-&mime_type=video_mp4&qs=0&rc=Nmg5Ojc7NDU0NDVoaGk0PEBpM2Q5Njo6Zjs7ZDMzNGkzM0A0L2MvNGA2XjYxMV5hNDViYSM1bWlwcjQwcl5gLS1kLS9zcw%3D%3D&l=20220624181002010208016205390882DC】
It downloads successfully.
The downloaded file still needs its extension filled in.
After adding the extension and opening it, it plays perfectly.
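Because the saved name is taken from the URL, the extension has to be added by hand afterwards; a minimal sketch (the source file name below is hypothetical — use whatever wget actually saved in your directory) is:
:: Give the downloaded file a name with an .mp4 extension (original name is hypothetical)
ren "downloaded_file" "douyin_video.mp4"
Alternatively, the -O option covered in section 3.5 names the file correctly in the first place.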
Finally, let's try downloading a web page — the Baidu search home page — by running 【wget www.baidu.com】.
It is saved to the current directory; opening it, the result is pretty much what we expected.
3.3 Writing the download log to a target file — the -o option (lowercase)
The long form of the short option -o is --output-file, so to download the Baidu search page we can run 【wget -o log.txt www.baidu.com】 or 【wget --output-file=log.txt www.baidu.com】.
After the command runs, a log file and an html file appear in the current directory, and the log is no longer shown in the cmd window.
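A small sketch of the whole round trip — download with the log redirected, then read the log back:
:: Write wget's messages to log.txt instead of the console
wget -o log.txt www.baidu.com
:: Inspect the log afterwards
type log.txt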
3.4 Reading URLs in bulk from a text file — the -i option
The long form of -i is --input-file, so if we want to download a batch of resources whose URLs are stored in a text file, we can run 【wget -i url.txt】 or 【wget --input-file=url.txt】.
Create a txt file containing two URLs, one per line.
Run 【wget -i url.txt】 and both downloads succeed.
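A minimal end-to-end sketch, reusing two URLs that already appeared above:
:: Build the URL list, one URL per line, then hand it to wget
echo www.baidu.com>url.txt
echo https://i2.hdslb.com/bfs/archive/75c3cff8734a76c3a671d9729eb50dbb7f7dc1c6.jpg@672w_378h_1c.webp>>url.txt
wget -i url.txt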
3.5 Downloading and renaming — the -O option (uppercase)
From the previous sections it is easy to see that if we do not name the downloaded resource ourselves, the program usually either appends an html extension automatically or adds no extension at all.
So, to save the trouble of renaming files in Explorer afterwards, we can do it directly when issuing the download command.
The long form of -O is --output-document; to download and rename, run 【wget -O filename url】 or 【wget --output-document=filename url】.
Run 【wget -O baidu.html www.baidu.com】.
If you want to save to a specific directory and rename the file at the same time, write it as 【wget -O filepath url】, where the last component of filepath is a file name.
So the -O option not only renames the file, it can also do the job of the -P option described below.
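A sketch of that combined form (with one caveat: as far as I can tell, -O does not create a missing directory for you, so the folder should already exist — unlike -P below):
:: Save Baidu's home page into an existing folder "webpage" under a chosen name
wget -O webpage\baidu.html www.baidu.com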
3.6 Downloading into a specified folder — the -P option (uppercase)
The long form of -P is --directory-prefix, so to download a resource into a specified folder we can run 【wget -P directory url】 or 【wget --directory-prefix=directory url】.
To store the download in the directory webpage (which does not have to exist yet), run 【wget -P webpage www.baidu.com】.
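-P also combines naturally with the -i option from section 3.4; a sketch assuming the url.txt built earlier:
:: Download every URL listed in url.txt into the folder "downloads" (created if missing)
wget -P downloads -i url.txt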
3.7 Recursively downloading an entire site — the -r option
The long form of -r is --recursive, so to download all of a site's resources we can run 【wget -r url】 or 【wget --recursive url】.
Let's try crawling CSDN. Because everything downloaded is saved into a folder named after the url, there is no need to worry about the files getting mixed up with anything else after running 【wget -r www.csdn.net】.
After the command finished, only two files had been downloaded, one of them named robots.txt, which shows that CSDN has anti-crawling measures in place — it is a big commercial site, after all, so I'll concede...
Let's try a smaller site instead, say the official website of a well-known university in Beijing (a friendly reminder: don't mess around — if your crawling causes no problems you're fine, but if it does cause trouble you may get a lawyer's letter from the university, so don't crawl continuously, just try it once).
Run 【wget -r www.tsinghua.edu.cn】 and the cmd window keeps scrolling, which suggests this site does not block this kind of crawl.
Not wanting things to get too "criminal", I pressed Ctrl+C to stop the crawl; the downloaded files are shown in the figure below (a gentler way to run such a crawl is sketched after this section).
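If you do experiment with recursive downloads, a more restrained sketch keeps the load on the server low (the depth, delay, and rate values are arbitrary examples of mine, not settings used above):
:: Recurse at most 1 level deep, wait 2 seconds between requests, and cap the rate at 200 KB/s
wget -r -l 1 -w 2 --limit-rate=200k www.tsinghua.edu.cn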
4. Summary and reflections on the cmd command
While experimenting with the -i option above, I found that running 【wget -i=url.txt】 fails because the file cannot be found.
Later, after reading other articles about options that come in both a short and a long form, I concluded that the value is best separated from the option by a space.
The reason is that with the short form the option and its value cannot be joined by an equals sign, only separated by a space, whereas with the long form either an equals sign or a space works.
Using the wrong syntax causes the value to be passed to the option incorrectly.
For example, after running 【wget -o=log.txt www.baidu.com】, the log file that gets created is not named "log.txt" but "=log.txt".
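Putting the rule in one place, a sketch of the forms that work and the one that misfires:
:: Short form: separate option and value with a space (works)
wget -o log.txt www.baidu.com
:: Long form: a space or an equals sign both work
wget --output-file log.txt www.baidu.com
wget --output-file=log.txt www.baidu.com
:: Short form joined with an equals sign: the log ends up in a file literally named "=log.txt"
wget -o=log.txt www.baidu.com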