Jun 07 2012

From the command line, what’s an easy way to test that a web page is up and displaying the content you expect? It could be that you need to write a monitoring script to check a remote site, or you just want to quickly ascertain whether you can access the internet from the command line, from wherever you’re logged in. wget is a great little tool for the job.

The wget tool is a wonderful Swiss-army knife for command line web browsing. It’s frequently used for recursively grabbing entire sites and their contents, but it can also be used as a very lightweight means of querying the web.

The -O switch specifies an output filename, and passing “-” as that filename sends the contents of the page to standard output instead. From there, you can pipe wget’s output into your favourite pattern-matching utility.

 # wget -O - http://www.example.com/statuspage.html | grep OK

Where “OK” could be any text pattern known to appear on the web page.
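If you’re building this into a monitoring script, the useful part is the exit status of the pipeline: grep returns 0 when the pattern is found and non-zero otherwise. Here’s a minimal sketch wrapping that idea in a function (the function name, URL and “OK” marker are placeholders of my own, not anything the page mandates):

```shell
#!/bin/sh
# check_url: succeed (exit 0) if the page loads and contains the marker text.
# wget -q keeps the transfer quiet; grep -q suppresses output and just
# sets the exit status.
check_url() {
    url=$1
    marker=$2
    wget -q -O - "$url" | grep -q "$marker"
}

# Example usage (hypothetical URL and marker):
if check_url "http://www.example.com/statuspage.html" "OK"; then
    echo "page is up"
else
    echo "page is down or marker text missing"
fi
```

From cron, you’d typically call `check_url` and alert only on a non-zero return, so a healthy site generates no noise.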

Or, if it’s an HTTPS page, ignore the certificate so you don’t get a pesky error:

 # wget -O - --no-check-certificate https://secure.example.com/status.html | grep OK

The extensive man page discusses further uses for this wonderful utility. Enjoy the read.

Matt Parsons is a freelance Linux specialist who has designed, built and supported Unix and Linux systems in the finance, telecommunications and media industries.

He lives and works in London.

  3 Responses to “Using “wget” to test web server connectivity”

  1. IMO grep is redundant, this is better:
    # wget -q http://www.example.com/statuspage.html
    and check the exit code.

  2. Thank you! This is exactly what I was looking for. Just want to check if our Oracle Applications login page is up.
