wget can download a complete website by recursively following its links and downloading the pages, much like a crawler. To download a whole site, use the --mirror option:
$ wget --mirror --convert-links exampledomain.com
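For example, to make the local copy browsable offline, you can add --page-requisites, which also downloads the images, stylesheets, and scripts each page references (exampledomain.com stands in for the real site):
$ wget --mirror --convert-links --page-requisites exampledomain.com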
Alternatively, use the following command:
$ wget -r -N -l DEPTH -k URL
The -l option specifies the maximum depth of web pages as levels; wget will traverse only that many levels of links. It is used along with -r (recursive). The -N option enables timestamping, so a file is downloaded only if it is newer than the existing local copy. URL is the base URL of the website to be downloaded. The -k or --convert-links option instructs wget to rewrite the links in the downloaded pages so that they point to the local copies.
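For instance, a sketch that stays within two levels of links from the front page (exampledomain.com is a placeholder):
$ wget -r -N -l 2 -k exampledomain.com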
Exercise discretion when mirroring other websites. Unless you have permission, mirror only for your personal use, and don't run the download too frequently.
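To be gentler on the server, you can pause between requests and cap the transfer rate; both --wait and --limit-rate are standard GNU Wget options (the values here are illustrative):
$ wget --mirror --convert-links --wait=2 --limit-rate=100k exampledomain.com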