Using cURL and GNU Wget to Test Websites

This article is part of our Network Troubleshooting series.

In this article, we will discuss how to examine the output of the cURL (also written as “curl”) and GNU Wget (also written as “wget”) utilities to troubleshoot connectivity problems.

GNU Wget is a free, non-interactive command-line tool that retrieves files over HTTP, HTTPS, and FTP. Curl is a similar tool for transferring data to and from a server using protocols such as HTTP, HTTPS, FTP, FTPS, SCP, SFTP, and TFTP.

It can be difficult to pinpoint the source of a web server’s slow performance using a web browser alone, because browsers often hide the HTTP error codes that would help you diagnose the problem. A better practice is to test your website’s TCP port 80 response time (for example with telnet) and compare it with the response times you get from the curl and wget tools. If the raw TCP response is fast but curl and wget are slow, the problem is usually not the network; it points instead to a faulty configuration of the web server or of the supporting application/database servers used to generate the site’s webpages.
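One quick way to see this split with curl itself is the -w (write-out) option, which reports the TCP connect time separately from the time to first byte and the total transfer time. A minimal sketch, using the same example.com host as the examples below:

# curl -o /dev/null -s -w 'connect: %{time_connect}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n' https://example.com

Here -o /dev/null discards the page body and -s hides the progress meter, so only the three timers are printed. A low connect time combined with a high first-byte time is the same signal described above: the network is fine, but the web server or its backends are slow to build the response.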

Using curl

Curl acts like a text-based browser that lets you see either the headers or the complete HTML source of a webpage on your screen. Use the curl command with the -I flag to view only the response headers, including the HTTP status code. Without the -I flag, curl prints the webpage’s full HTML code to the screen. Either method can give you a good idea of your server’s performance.

# curl -I https://example.com
HTTP/1.1 200 OK
Date: Sun, 01 Mar 2015 13:16:38 GMT
Server: Apache/2.2.15 (CentOS)
X-Powered-By: PHP/5.3.3
Connection: close
Content-Type: text/html; charset=UTF-8

In the example above, the web server appears to be working correctly because curl returns a 200 OK status code. Refer to the full list of HTTP response status codes for the other values you may encounter.
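Note that many sites answer the first request with a redirect (the wget example below receives a 302 Found before reaching the final page). To have curl follow redirects the way a browser would, add the -L flag; a minimal sketch against the same example.com host:

# curl -IL https://example.com

With -L, curl prints the headers of every response in the redirect chain, so you can verify both the intermediate 30x status codes and the final 200 OK.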

Using wget

Wget can recursively download a site’s pages, including its entire directory structure, to a local directory. When you run it with the -N flag, wget’s timestamping feature, its output also shows the following data:

  • download speed of the page
  • size of the downloaded file
  • start and finish times of the download

Wget’s -N flag output is a great snapshot of your server’s performance. Do note that wget does not take into account the JavaScript, CSS, and third-party servers used by your website.


# wget -N example.com
--2015-03-01 08:29:36--  http://example.com/
Resolving example.com (example.com)... 208.83.X.X
Connecting to example.com (example.com)|208.83.X.X|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://example.com/ [following]
--2015-03-01 08:29:36--  https://example.com/
Connecting to example.com (example.com)|208.83.X.X|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: 'index.html'

    [ <=>                                   ] 21,412      --.-K/s   in 0.1s

Last-modified header missing -- time-stamps turned off.
2015-03-01 08:29:36 (153 KB/s) - 'index.html' saved [21412]
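The command above downloads only the HTML document itself; as noted earlier, it says nothing about the stylesheets, scripts, and images the page references. For a fuller picture, wget can also fetch a page together with its requisites, or mirror pages recursively. A minimal sketch, again using example.com:

# wget -p example.com
# wget -r -l 2 -N example.com

The first command uses -p (page requisites) to download the page along with the images, CSS, and JavaScript it references from the same host; the second uses -r with -l 2 to follow links up to two levels deep, while -N keeps the timestamping behaviour shown above. Comparing these timings with the single-page download shows how long a complete page view actually takes to serve.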

Conclusion

Curl and wget can help you determine whether slow performance is caused by the configuration of your web server (or of the application/database servers it uses to generate your webpages) rather than by the network itself.

See our Knowledgebase for more How-To articles.
