The wget command below downloads a website while excluding the directories /forums, /files, /images, /gallery, /nc, and /diane. It uses the options --mirror, --convert-links, -nc, and -r to retrieve the site recursively, rewrite links for offline browsing, and avoid overwriting files that already exist locally. Two caveats: --mirror already implies -r, and it enables timestamping (-N), which conflicts with -nc, so current versions of wget will refuse to run with both flags at once.
wget --mirror --convert-links -nc -r http://www.truecombatelite.com/mapdepot/ -X /forums -X /files -X /images -X /gallery -X /nc -X /diane
#!/bin/bash
# URL to download
BASE_URL="http://www.truecombatelite.com/mapdepot/"

# Directories to exclude; wget's -X option takes a comma-separated list
EXCLUDED_DIRS="/forums,/files,/images,/gallery,/nc,/diane"

# wget options (--mirror already implies -r, and it enables timestamping,
# which conflicts with -nc, so neither of those flags is repeated here)
WGET_OPTIONS=(
    --mirror          # mirror the site: -r -N -l inf --no-remove-listing
    --convert-links   # rewrite links to point at the local copies
)

# Build the command as an array so each argument stays intact
WGET_CMD=(wget "${WGET_OPTIONS[@]}" -X "$EXCLUDED_DIRS" "$BASE_URL")

# Print the final wget command
echo "${WGET_CMD[@]}"
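A side note on the design choice above: this sketch (using a hypothetical argument value) shows why the options are kept in a bash array rather than a flat string. Array expansion preserves each argument intact, while word splitting on a string breaks any argument that contains a space:

```shell
#!/bin/bash
# Hypothetical wget argument whose value contains a space
ARGS=("--directory-prefix=/tmp/map depot")
# Array expansion keeps it as a single argument
set -- "${ARGS[@]}"
echo "array: $# argument(s)"    # -> array: 1 argument(s)
# Flat-string expansion splits it into two arguments
STR="--directory-prefix=/tmp/map depot"
set -- $STR
echo "string: $# argument(s)"   # -> string: 2 argument(s)
```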
This is a wget command used for downloading a website while excluding certain directories.

--mirror: turn on options suitable for mirroring; recursively retrieve the entire site, including files and directories.
--convert-links: convert links in the downloaded pages to point to the local files, making them usable offline.
-nc (--no-clobber): do not re-download files that already exist locally. Note that this conflicts with the timestamping (-N) that --mirror enables.
-r: turn on recursive retrieving (already implied by --mirror).
-X /forums -X /files -X /images -X /gallery -X /nc -X /diane: exclude these directories from the recursive download.
http://www.truecombatelite.com/mapdepot/: the URL to download from.
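As a small sketch, a space-separated list of directories (often the more readable way to maintain such a list) can be converted into the single comma-separated value that wget's -X option expects. This assumes no directory name itself contains a space:

```shell
#!/bin/sh
# Space-separated list of directories to exclude
EXCLUDED_DIRS="/forums /files /images /gallery /nc /diane"
# wget's -X expects one comma-separated list
X_LIST=$(printf '%s' "$EXCLUDED_DIRS" | tr ' ' ',')
echo "$X_LIST"    # -> /forums,/files,/images,/gallery,/nc,/diane
```

The result can then be passed as a single argument, e.g. wget --mirror --convert-links -X "$X_LIST" http://www.truecombatelite.com/mapdepot/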