The curl command in Linux is explained in detail
- 2020-05-13 04:16:52
- OfStack
Syntax
# curl [option] [url]
Common parameters:
-A/--user-agent <string>   Set the User-Agent string sent to the server
-b/--cookie <name=string/file>   Cookie string, or file to read cookies from
-c/--cookie-jar <file>   Write cookies to this file when the operation finishes
-C/--continue-at <offset>   Resume a transfer at the given offset
-D/--dump-header <file>   Write the received header information to this file
-e/--referer <url>   Set the source (Referer) URL
-f/--fail   Fail silently on HTTP errors instead of outputting the error page
-o/--output <file>   Write output to this file
-O/--remote-name   Write output to a local file named like the remote file
-r/--range <range>   Retrieve only a byte range from an HTTP/1.1 or FTP server
-s/--silent   Silent mode; show no progress or error messages
-T/--upload-file <file>   Upload a file
-u/--user <user[:password]>   Set the server user name and password
-w/--write-out [format]   What to display after a completed transfer
-x/--proxy <host[:port]>   Use an HTTP proxy on the given port
-#/--progress-bar   Show a progress bar for the current transfer
Example:
1. Basic usage
# curl http://www.linux.com
After execution, the HTML of www.linux.com is displayed on the screen
Ps: this is often used to test whether a server can reach a website, because Linux servers are frequently installed without a desktop and therefore have no browser
2. Save the page you visit
2.1: save using the shell's output redirection
# curl http://www.linux.com >> linux.html
2.2: web pages can be saved using curl's built-in option: -o (lower case)
$ curl -o linux.html http://www.linux.com
After it runs, a transfer report like the one below is displayed; 100% in the first column means the save succeeded
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 79684 0 79684 0 0 3437k 0 --:--:-- --:--:-- --:--:-- 7781k
2.3: you can use curl's built-in option: -O (uppercase) to save files on a web page
Note that the url must point to a specific file, or you won't be able to grab it
# curl -O http://www.linux.com/hello.sh
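The difference between -o and -O can be tried without any web server by pointing curl at a file:// URL. A minimal sketch (all paths are invented for the demo):

```shell
#!/bin/sh
# Sketch: exercise -o (pick a name) and -O (keep the remote name)
# against a local file:// URL, so no network is needed.
set -e
mkdir -p /tmp/curl_demo/src /tmp/curl_demo/dst
printf '<html>hello</html>' > /tmp/curl_demo/src/hello.html
cd /tmp/curl_demo/dst

# -o: save under a name we choose
curl -s -o saved_as.html "file:///tmp/curl_demo/src/hello.html"
# -O: save under the remote file's own name, hello.html
curl -s -O "file:///tmp/curl_demo/src/hello.html"
```

For a real site, substitute the http:// URL; the flags behave the same way.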
3. Test the return value of the webpage
# curl -o /dev/null -s -w %{http_code} www.linux.com
Ps: in scripts, this is a very common way to test whether a site is working properly
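The one-liner above is easy to wrap in a small script. A minimal sketch (the check_site name and the 200-only success rule are my choices, not from the article; you may want to also accept 3xx codes):

```shell
#!/bin/sh
# Sketch of a health check built on `curl -w %{http_code}`.
# check_site succeeds only when the site answers 200; any other status,
# or no connection at all (which prints 000), makes it fail.
check_site() {
  code=$(curl -s -o /dev/null -m 5 -w '%{http_code}' "$1")
  [ "$code" = "200" ]
}

# Usage (placeholder host): check_site http://www.linux.com && echo "site is up"
```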
4. Specify the proxy server and its port
You often need to go through a proxy to reach the Internet (for example, when your network requires a proxy server, or when a site has blocked your IP because you were fetching it with curl). Fortunately, curl supports setting a proxy with the built-in option: -x
# curl -x 192.168.100.100:1080 http://www.linux.com
5. Cookies
Some websites use cookies to record session information. A browser such as Chrome handles cookies transparently, and curl can handle them just as easily by adding the relevant options.
5.1: save the cookies from the http response. Built-in option: -c (lowercase)
# curl -c cookiec.txt http://www.linux.com
After execution, cookie information is saved into cookiec.txt
5.2: save the header information from the http response. Built-in option: -D
# curl -D cookied.txt http://www.linux.com
After execution, the header information (including any Set-Cookie lines) is saved into cookied.txt
Note: the cookie file produced by -c (lowercase) is in curl's cookie-jar format, which is not the same as the raw headers saved by -D.
5.3: use cookies
Many websites monitor your cookie information to judge whether you are visiting by the rules, so we need to send back the cookies we saved. Built-in option: -b
# curl -b cookiec.txt http://www.linux.com
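The two steps are typically combined: fetch once with -c to capture the session, then replay it with -b. A sketch, using a file:// URL in place of the real site so it runs offline (file:// sets no cookies, so the jar only gets curl's header comment, but the command shapes are the same):

```shell
#!/bin/sh
# Sketch: capture cookies with -c, then send them back with -b.
# The file:// URL is a stand-in for the article's placeholder site.
set -e
printf 'hello' > /tmp/cookie_demo.txt
curl -s -c /tmp/cookiec.txt -o /dev/null "file:///tmp/cookie_demo.txt"
# later requests replay the saved cookies:
curl -s -b /tmp/cookiec.txt -o /dev/null "file:///tmp/cookie_demo.txt"
```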
6. Mimic the browser
Some sites require a specific browser, or even a specific version, to access them. curl's built-in option: -A lets us specify the browser identity to use when visiting the website
# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com
The server side will then assume that it was accessed using IE 8.0
7. Forge the referer (hotlinking protection)
Many servers check the HTTP Referer header to control access. For example: you first visit the home page and then follow a link to the mail page, so the mail request's referer is the home page's address. If the server sees a mail-page request whose referer is not the home page, it concludes the request is a hotlink.
curl's built-in option: -e lets us set the referer
# curl -e "www.linux.com" http://mail.linux.com
This will make the server think you clicked on a link from www.linux.com
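The -A and -e tricks are often combined so a scripted fetch looks like a normal browser click-through. A sketch, reusing the article's example user agent and referer; a file:// URL stands in for the target so the command can be exercised offline (both headers are simply ignored there):

```shell
#!/bin/sh
# Sketch: pretend to be IE 8.0 arriving from www.linux.com.
# The command shape is what matters; swap in the real target URL.
set -e
UA='Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)'
printf 'mail page' > /tmp/referer_demo.txt
curl -s -A "$UA" -e "www.linux.com" -o /tmp/referer_out.txt "file:///tmp/referer_demo.txt"
```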
8. Download files
8.1: download files using curl.
Use built-in option: -o (lowercase)
# curl -o dodo1.JPG http://www.linux.com/dodo1.JPG
Use built-in option: -O (uppercase)
# curl -O http://www.linux.com/dodo1.JPG
This saves the file locally with the name on the server
8.2: cyclic download
Sometimes you want to download a batch of files whose names share a prefix and differ only in a trailing number:
# curl -O http://www.linux.com/dodo[1-5].JPG
This will save dodo1, dodo2, dodo3, dodo4, dodo5
8.3: download with renaming
# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG
Because the files under hello and bb share the same names (dodo1 through dodo5), the second batch of downloads overwrites the first, so the downloads need to be renamed:
# curl -o #1_#2.JPG http://www.linux.com/{hello,bb}/dodo[1-5].JPG
The file downloaded from hello/dodo1.JPG is saved as hello_dodo1.JPG, and so on, which effectively avoids overwriting files with the same name
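The {}/[] globbing and the #1/#2 output variables are handled by curl itself for any protocol, so the renaming trick can be tried locally with file:// URLs (all paths invented for the demo):

```shell
#!/bin/sh
# Sketch: reproduce the #1_#2 renaming locally. curl expands the
# {hello,bb} and [1-2] globs and substitutes each match into the
# #1 and #2 variables of the -o template.
set -e
mkdir -p /tmp/glob_demo/hello /tmp/glob_demo/bb
for d in hello bb; do
  for n in 1 2; do
    printf '%s-%s' "$d" "$n" > "/tmp/glob_demo/$d/dodo$n.txt"
  done
done

# quotes keep the shell away from #, {} and []; curl does the expanding
curl -s -o '/tmp/glob_demo/#1_#2.txt' "file:///tmp/glob_demo/{hello,bb}/dodo[1-2].txt"
```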
8.4: download in blocks
Sometimes the file to download is large, and we can fetch it in pieces using the built-in option: -r. Note that byte ranges are inclusive at both ends, so consecutive parts must not share a boundary byte:
# curl -r 0-99 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 100-199 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 200- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG
Concatenating the parts reassembles the complete dodo1.JPG
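Since -r also works for file:// URLs, the split-and-reassemble pattern can be verified locally without a server (file names invented for the demo):

```shell
#!/bin/sh
# Sketch: fetch a "file" in three inclusive byte ranges, then
# concatenate the parts back into the original.
set -e
printf 'ABCDEFGHIJ' > /tmp/range_src.bin   # a 10-byte stand-in

curl -s -r 0-3 -o /tmp/range_part1 "file:///tmp/range_src.bin"  # bytes 0..3
curl -s -r 4-7 -o /tmp/range_part2 "file:///tmp/range_src.bin"  # bytes 4..7
curl -s -r 8-  -o /tmp/range_part3 "file:///tmp/range_src.bin"  # bytes 8..end

cat /tmp/range_part1 /tmp/range_part2 /tmp/range_part3 > /tmp/range_joined
```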
8.5: download the file via ftp
curl can also download over FTP, and offers two syntaxes for supplying the credentials:
# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:password@www.linux.com/dodo1.JPG
8.6: display the download progress bar
# curl -# -O http://www.linux.com/dodo1.JPG
8.7: hide the download progress information
# curl -s -O http://www.linux.com/dodo1.JPG
9. Resume a broken download
On Windows, tools such as Xunlei offer resumable downloads; curl achieves the same effect with the built-in option: -C. If the connection drops while downloading dodo1.JPG, you can resume the transfer as follows (the lone - after -C tells curl to work out the resume offset by itself):
# curl -C - -O http://www.linux.com/dodo1.JPG
10. Upload files
curl can upload files as well as download them, using the built-in option: -T
# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/
This uploads dodo1.JPG to the ftp server
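Uploads can also be smoke-tested offline: curl's FILE protocol accepts -T too, so a file:// URL can stand in for the server (paths invented; for the real thing use the ftp:// URL and credentials above):

```shell
#!/bin/sh
# Sketch: -T uploads the named local file to the target URL.
# Here the "server" is just a local directory reached via file://.
set -e
mkdir -p /tmp/upload_demo
printf 'image-bytes' > /tmp/dodo1.JPG
curl -s -T /tmp/dodo1.JPG "file:///tmp/upload_demo/dodo1.JPG"
```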
11. Report fetch errors
# curl -f http://www.linux.com/error
With -f, curl fails on an HTTP server error: it reports the error code and returns a non-zero exit status instead of printing the server's error page, which is handy in scripts
Other parameters:
-a/--append   When uploading, append to the target file
--anyauth   Let curl pick any suitable authentication method
--basic   Use HTTP Basic authentication
-B/--use-ascii   Use ASCII/text transfer
-d/--data <data>   Send the data in an HTTP POST
--data-ascii <data>   POST the data as ASCII
--data-binary <data>   POST the data as binary
--negotiate   Use HTTP Negotiate authentication
--digest   Use HTTP Digest authentication
--disable-eprt   Disable the use of EPRT or LPRT
--disable-epsv   Disable the use of EPSV
--egd-file <file>   Set the EGD socket path for random data (SSL)
--tcp-nodelay   Use the TCP_NODELAY option
-E/--cert <cert[:passwd]>   Client certificate file and password (SSL)
--cert-type <type>   Certificate file type (DER/PEM/ENG) (SSL)
--key <key>   Private key file name (SSL)
--key-type <type>   Private key file type (DER/PEM/ENG) (SSL)
--pass <pass>   Private key password (SSL)
--engine <eng>   Crypto engine to use (SSL); "--engine list" shows the options
--cacert <file>   CA certificate used to verify the peer (SSL)
--capath <directory>   CA certificate directory (made with c_rehash) to verify the peer against (SSL)
--ciphers <list>   SSL ciphers to use
--compressed   Request a compressed response (deflate or gzip)
--connect-timeout <seconds>   Maximum time allowed for the connection
--create-dirs   Create the necessary local directory hierarchy
--crlf   Convert LF to CRLF when uploading
--ftp-create-dirs   Create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd]   Control the use of CWD
--ftp-pasv   Use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip   Ignore the IP address returned by PASV
--ftp-ssl   Try SSL/TLS for the FTP transfer
--ftp-ssl-reqd   Require SSL/TLS for the FTP transfer
-F/--form <name=content>   Emulate an HTTP form submission
--form-string <name=string>   Emulate an HTTP form submission with a literal string value
-g/--globoff   Disable URL globbing with {} and []
-G/--get   Send the data with a GET request
-h/--help   Show help
-H/--header <line>   Pass a custom header line to the server
--ignore-content-length   Ignore the HTTP Content-Length header
-i/--include   Include the protocol headers in the output
-I/--head   Show document (header) information only
-j/--junk-session-cookies   Ignore session cookies when reading a cookie file
--interface <interface>   Use the specified network interface/address
--krb4 <level>   Use krb4 at the specified security level
-k/--insecure   Allow connections to SSL sites without verifying the certificate
-K/--config <file>   Read configuration from the specified file
-l/--list-only   List only the file names in an FTP directory
--limit-rate <rate>   Limit the transfer speed
--local-port <num>   Force the use of a specific local port number
-m/--max-time <seconds>   Maximum time allowed for the transfer
--max-redirs <num>   Maximum number of redirects to follow
--max-filesize <bytes>   Maximum size of a file to download
-M/--manual   Display the full manual
-n/--netrc   Read the user name and password from the .netrc file
--netrc-optional   Use .netrc or the URL; overrides -n
--ntlm   Use HTTP NTLM authentication
-N/--no-buffer   Disable buffered output
-p/--proxytunnel   Tunnel through the HTTP proxy
--proxy-anyauth   Let curl pick any proxy authentication method
--proxy-basic   Use Basic authentication with the proxy
--proxy-digest   Use Digest authentication with the proxy
--proxy-ntlm   Use NTLM authentication with the proxy
-P/--ftp-port <address>   Use the PORT command with this address instead of PASV
-Q/--quote <cmd>   Send the command to the server before the transfer
--random-file <file>   File to read random data from (SSL)
-R/--remote-time   Give the local file the remote file's timestamp
--retry <num>   Number of retries when the transfer fails
--retry-delay <seconds>   Delay between retries when the transfer fails
--retry-max-time <seconds>   Maximum total time to spend on retries
-S/--show-error   Show errors even in silent mode
--socks4 <host[:port]>   Use a SOCKS4 proxy on the given host and port
--socks5 <host[:port]>   Use a SOCKS5 proxy on the given host and port
-t/--telnet-option <OPT=val>   Set a Telnet option
--trace <file>   Write a debug trace to the specified file
--trace-ascii <file>   Like --trace, but without the hex output
--trace-time   Add timestamps to trace/verbose output
--url <URL>   Specify the URL to work with
-U/--proxy-user <user[:password]>   Set the proxy user name and password
-V/--version   Show version information
-X/--request <command>   Specify the request method (command) to use
-y/--speed-time <seconds>   Time the speed must stay below the limit before aborting; defaults to 30
-Y/--speed-limit <rate>   Abort the transfer if it is slower than this for speed-time seconds
-z/--time-cond <time>   Transfer only if the file was modified relative to the given time
-0/--http1.0   Use HTTP 1.0
-1/--tlsv1   Use TLSv1 (SSL)
-2/--sslv2   Use SSLv2 (SSL)
-3/--sslv3   Use SSLv3 (SSL)
--3p-quote   Like -Q, for the source URL in a third-party transfer
--3p-url <url>   Source URL for a third-party transfer
--3p-user <user[:password]>   Source user and password for a third-party transfer
-4/--ipv4   Use IPv4
-6/--ipv6   Use IPv6
Conclusion
That is the whole content of this article. I hope it brings some help to your study or work; if you have any questions, feel free to leave a comment.