
How can I download a ZIP file with the curl command? I tried curl -sO, but an error occurred. I want to download the zip file from this address: http://github.com/ziyaddin/xampp/archive/master.zip

but I can't. What must I type?

Ziyaddin Sadygly

8 Answers


I used curl -LO and it worked fine. wget works too.
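
For reference, a minimal sketch of that invocation against the URL from the question (-L follows the redirect, -O keeps the remote filename):

curl -LO http://github.com/ziyaddin/xampp/archive/master.zip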

Ziyaddin Sadygly

Try wget instead of curl:

wget http://github.com/ziyaddin/xampp/archive/master.zip

don.joey

You can use:

curl https://github.com/ziyaddin/xampp/archive/master.zip -O -J -L

Saves as: xampp-cli-master.zip

  • use -L in case a redirect is found (you can see the redirect yourself, as shown below).
  • use -O to save with the remote filename (master.zip).
  • use -J (together with -O) to use the filename from the remote Content-Disposition header (xampp-cli-master.zip).
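
To see the redirect that makes -L necessary, you can inspect the response headers first (a sketch: -s silences progress output, -I requests headers only):

curl -sI https://github.com/ziyaddin/xampp/archive/master.zip | grep -i '^location'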


Or use -o to create your own filename:

curl https://github.com/ziyaddin/xampp/archive/master.zip -L -o MyFilename.zip

Saves as: MyFilename.zip

Note: GitHub supports SSL, so use https://.


You could also use a curlrc config file, or alias the curl command to curl -O -L, to make it behave more like wget.
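
A minimal sketch of both approaches (the alias name curlw is just an example). In a ~/.curlrc config file, long option names are written without the leading dashes:

location
remote-name

Or as a shell alias:

alias curlw='curl -O -L'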

Consult: https://curl.haxx.se/docs/manpage.html#OPTIONS (See: -O, -J, -L, -o)

Note the warning in the curl man page about the -J option:

There's no attempt to decode %-sequences (yet) in the provided file name, so this option may provide you with rather unexpected file names.

WARNING: Exercise judicious use of this option, especially on Windows. A rogue server could send you the name of a DLL or other file that could possibly be loaded automatically by Windows or some third party software.

B. Shea

If you want to download the file, use wget [option]... [URL]... instead.

For more information regarding the options, just type this into your Terminal: man wget

For your purpose:

wget http://github.com/ziyaddin/xampp/archive/master.zip

Note that the .zip file will be saved in your current working directory.
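
If you want a different filename or location, wget's -O option lets you choose one (a sketch; MyFilename.zip is just an example name):

wget -O MyFilename.zip http://github.com/ziyaddin/xampp/archive/master.zip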

Meintjes
  • It would be helpful to explain the advantages of using wget over curl. For anyone wondering about the differences between the two, see here. – j-- Dec 09 '14 at 17:50
  • @JorgeBucaran I read the comparison written by the author of curl, who also contributes to wget in a minor role. One glaring omission (based on my one-time small project) is that wget is 10 times faster than curl (<2 seconds vs. 12 seconds) at retrieving sunrise and sunset times from https://www.timeanddate.com – WinEunuuchs2Unix Mar 03 '17 at 02:49
  • Sometimes when you use wget and the file is not served directly, but instead the URL tells a service where to locate and serve the file, what you end up downloading is an HTML page. So curl is better for some files. – lesolorzanov May 28 '19 at 11:51

To download files from GitHub (or any other site that issues redirects) using curl, you must pass -L (follow redirects), so use:

curl -L http://github.com/ziyaddin/xampp/archive/master.zip -o xampp.zip


"curl -LOk" makes it using insecure protocols (-k), disabling SSL certificate checks (which fail because of the incorrect path not matching the certificate), and create the output file using the filename (-O) proposed from the remote server (also insecure). In my opinion, the server should better be configured create local redirecting aliases where needed, to avoid having to pass this non conforming part of the URL. But the redirection is made using an HTTP redirect only (HTTP error 30x), which causes problems to curl that does not follow (by default) this (unsecure) redirection, and option (-k) disables this security check. So instead of using HTTP redirects (moved permanently), please add some internal link on the filesystem of the server. If this is needed because the URL is in fact hosted on another physical server with no direct access between their filesystems, you should use another DNS name for this host, so that clients will query the correct one directly, without any redirect. Then fix your web pages (you may use your webserver logs to see where HTTP 30x redirects are returned and which webpages may need to be checked and updated). Forcing clients to use unsecure/unchecked protocols is not a good option.

So this is definitely a problem of very bad configuration on the web server: try contacting their admin to fix that.
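
As the other answers here show, the safe alternative is simply to request the https URL and let -L follow the redirect, so -k is never needed:

curl -LO https://github.com/ziyaddin/xampp/archive/master.zip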

verdy_p

Well, you can use Axel as well. Axel is a light command-line download accelerator. It supports the HTTP, HTTPS, FTP, and FTPS protocols, and it is a nice, faster alternative.

Install axel using:

apt-get install axel

Then download your file with:

axel http://github.com/ziyaddin/xampp/archive/master.zip
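
Axel can also open several connections to the server to speed up the download; the -n option sets the number of connections (4 here is just an example):

axel -n 4 http://github.com/ziyaddin/xampp/archive/master.zip
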
Mahesh

Use the -L option to follow redirects; you can also use the --output option to give a path and a name to the zip file. For example:

sudo curl -L https://github.com/CISOfy/lynis/archive/master.zip --output /etc/lynis.zip

Note: Use sudo if you don't have permissions to write to the target folder.