Automatically resume interrupted downloads in OSX with curl

I recently found myself at the wrong end of a crappy internet connection, needing to download a 97MB file. Safari and Chrome would both choke partway through and leave me with an unusable, truncated 4MB file and no way to resume the download. The OSX command-line download utility, curl, can resume downloads, but I had to check manually each time the download was interrupted and re-run the command.

After typing “up, enter” more times than I care to admit, I decided to figure out the automatic way. Here’s my bash one-liner to get automatic download resume in curl:

export ec=18; while [ $ec -eq 18 ]; do /usr/bin/curl -O -C - "http://www.example.com/a-big-archive.zip"; export ec=$?; done

Explanation: the exit code curl chucks when a download is interrupted is 18 (“partial file”), and $? gives you the exit code of the last command in bash. So, while the exit code is 18, keep trying to download the file, keeping the remote filename (-O) and resuming where the previous attempt left off (-C -).
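If a one-liner isn’t your style, here’s the same loop spelled out with comments. It’s just a sketch of the command above, with the same placeholder URL:

    # Keep re-running curl until it exits with something other than 18 ("partial file").
    url="http://www.example.com/a-big-archive.zip"
    ec=18                            # seed the loop as if the last attempt was interrupted
    while [ $ec -eq 18 ]; do
        /usr/bin/curl -O -C - "$url" # -O keeps the remote filename, -C - resumes from the existing partial file
        ec=$?                        # exit code of the curl run we just finished
    done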

Update: As Jan points out in the comments, depending on what is going wrong with the download, your error code may be different. Just change “18” to whatever error code you’re seeing to get it to work for you! (If you’re feeling adventurous, you could change the condition to while [ $ec -ne 0 ], but that feels like using a bare except in Python, which is bad. ;)
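For instance, here’s a rough sketch that retries on either 18 or 56 (the receive error mentioned in the comments below); swap in whichever codes curl is actually giving you.

    # Retry on 18 (partial file) or 56 (receive failure); any other exit code ends the loop.
    ec=18
    while [ $ec -eq 18 ] || [ $ec -eq 56 ]; do
        /usr/bin/curl -O -C - "http://www.example.com/a-big-archive.zip"
        ec=$?
    done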

4 thoughts on “Automatically resume interrupted downloads in OSX with curl”

  1. Thanks, this helped me squeeze a 2GB OSX update through a shaky Colombian wifi connection. I got some “(56) Recv failure: Operation timed out” errors and had to press enter manually. Perhaps the script could be improved to handle multiple error codes in the future.

  2. Hello, I am not a computer expert, but I have lots of huge files to download and I have been hitting resume for days. Please help! I opened Terminal, pasted in the command (inserting the website link), and this is what appears. What am I doing wrong?

    Last login: Sat Jan 14 01:53:16 on ttys000
    MGBLON04D9004old:~ nayana$ export ec=18; while [ $ec -eq 18 ]; do /usr/bin/curl -O -C - https://insidetheedit.com/shop/; export ec=$?; done
    curl: Remote file name has no length!
    curl: try 'curl --help' or 'curl --manual' for more information
    MGBLON04D9004old:~ nayana$

    Thanks in advance!

    1. Hi Nayana,

      curl can’t download a whole folder of stuff, only individual files. So, for example:

      curl -O -C - https://insidetheedit.com/shop/file.mp4

      should work, but

      curl -O -C - https://insidetheedit.com/shop/

      won’t.

      Additionally, if you need to log in to download the files, I suspect you need to pass additional options to curl:

      curl -O -C - -u myusername:mypassword https://insidetheedit.com/shop/file.mp4
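      (The username, password, and filename there are just placeholders.) And if you also want the automatic resume from the post, you could wrap that same command in the loop, something like:

      export ec=18; while [ $ec -eq 18 ]; do /usr/bin/curl -O -C - -u myusername:mypassword "https://insidetheedit.com/shop/file.mp4"; export ec=$?; done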

      I hope this helps!
