46

I'm writing a script to download a bunch of files, and I want it to inform me when a particular file doesn't exist.

r=`wget -q www.someurl.com`
if [ $r -ne 0 ]
  then echo "Not there"
  else echo "OK"
fi

But it gives the following error on execution:

./file: line 2: [: -ne: unary operator expected

What's wrong?

7 Answers

83

Others have correctly posted that you can use $? to get the most recent exit code:

wget_output=$(wget -q "$URL")
if [ $? -ne 0 ]; then
    ...

This lets you capture both the stdout and the exit code. If you don't actually care what it prints, you can just test it directly:

if wget -q "$URL"; then
    ...

And if you want to suppress the output:

if wget -q "$URL" > /dev/null; then
    ...
1
  • Is it standard that if will get the exit code of the program when this form is used? (if <program>; then...)
    – Vassilis
    Commented Nov 8, 2013 at 16:53
47

$r is the text output of wget (which you've captured with backticks). To access the return code, use the $? variable.
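A minimal sketch of that applied to the script in the question (same URL and messages as the original):

wget -q www.someurl.com
status=$?                    # exit code of the wget call, 0 on success
if [ "$status" -ne 0 ]
  then echo "Not there"
  else echo "OK"
fi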

1
  • 4
    While this is correct, a better explanation would be why $r is blank and why the error message occurs.
    Commented Apr 3, 2014 at 12:06
18

$r is empty, so your condition becomes if [ -ne 0 ]; with two arguments, test expects the first to be a unary operator, and -ne isn't one. Try this instead:

wget -q www.someurl.com
if [ $? -ne 0 ]
  ...

EDIT As Andrew explained before me, backticks return standard output, while $? returns the exit code of the last operation.
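For instance, with generic commands (nothing wget-specific here):

out=`echo hello`   # backticks capture standard output: out is now "hello"
echo $?            # $? holds the exit code of the last command: 0 here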

1
  • 6
    this is the only answer that explains why the asker is getting that error message. +1.
    – Lesmana
    Commented Jan 2, 2013 at 21:25
12

You could just do:

wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"

-(~)----------------------------------------------------------(07:30 Tue Apr 27)
risk@DockMaster [2024] --> wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure" 
--2010-04-27 07:30:56--  http://ruffingthewitness.com/
Resolving ruffingthewitness.com... 69.56.251.239
Connecting to ruffingthewitness.com|69.56.251.239|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html.1'

    [ <=>                                                                                    ] 14,252      72.7K/s   in 0.2s    

2010-04-27 07:30:58 (72.7 KB/s) - `index.html.1' saved [14252]

WE GOT IT
-(~)-----------------------------------------------------------------------------------------------------------(07:30 Tue Apr 27)
risk@DockMaster [2025] --> wget ruffingthewitness.biz && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:31:05--  http://ruffingthewitness.biz/
Resolving ruffingthewitness.biz... failed: Name or service not known.
wget: unable to resolve host address `ruffingthewitness.biz'
zsh: exit 1     wget ruffingthewitness.biz
Failure
-(~)-----------------------------------------------------------------------------------------------------------(07:31 Tue Apr 27)
risk@DockMaster [2026] --> 
3
  • 2
    Code-only answers are discouraged. Please write some explanation of how it works.
    – foobar
    Commented Nov 29, 2019 at 12:45
  • How does it work?
    – Stelios333
    Commented Dec 2, 2021 at 16:31
  • && and || will run the next statement if the previous one succeeded or failed, respectively.
    Commented Sep 21, 2023 at 12:09
6

The best way to capture the output from wget and also check the call status:

wget -O filename URL          # save the download to "filename"
if [[ $? -ne 0 ]]; then       # non-zero exit code means wget failed
    echo "wget failed"
    exit 1
fi

This way you can check the status of wget as well as store the output data.

  1. If the call is successful, use the stored output (see the sketch below).

  2. Otherwise it exits with the error "wget failed".
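A minimal sketch putting this together (the URL and filename here are placeholders, not from the original answer):

url="http://www.someurl.com/somefile"   # placeholder URL
outfile="somefile"                      # placeholder output filename

wget -q -O "$outfile" "$url"
if [[ $? -ne 0 ]]; then
    echo "wget failed"
    exit 1
fi

# Success: use the stored output, e.g. report its size
wc -c "$outfile"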

3

I have been trying all the solutions without luck.

wget executes in a non-interactive way. This means that wget works in the background and you can't catch the return code with $?.

One solution is to use the "--server-response" option and search for the HTTP 200 status code. Example:

wget --server-response -q -o wgetOut http://www.someurl.com   # -o writes the log (including the response headers) to wgetOut
sleep 5
_wgetHttpCode=`cat wgetOut | gawk '/HTTP/{ print $2 }'`       # pull the status code from the "HTTP/1.1 200 OK" line
if [ "$_wgetHttpCode" != "200" ]; then
    echo "[Error] `cat wgetOut`"
fi

Note: wget needs some time to finish its work; for that reason I put "sleep 5". This is not the best way to do it, but it worked OK for testing the solution.

0

The curl utility has the --dump-header option, which allows you to write the response headers to a separate file.

tmp=`mktemp -u`                                            # temporary file for the response headers
curl --silent --dump-header "$tmp" -o ./retOut http://www.someurl.com
sleep 5
_wgetHttpCode=`cat ${tmp} | gawk '/HTTP/{ print $2 }'`     # extract the status code from the header file
if [ "$_wgetHttpCode" != "200" ]; then
    cat "${tmp}"
fi
[ -e "$tmp" ] && rm "$tmp"
1
  • As it’s currently written, your answer is unclear. Please edit to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers in the help center.
    – Community Bot
    Commented May 11 at 5:54
