This is very handy when you're too busy to click through a site and download each file you need by hand.
-H means foreign hosts are allowed, and -D restricts which domains those can be. If you don't restrict them, wget will wander off across the whole internet via ads and other external links, just like a search engine crawler would.
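As a minimal sketch of that pairing (the domains and URL here are just the placeholders from the full command below):

wget -r -H -D allowedomain.com,otherdomain.com http://url.com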
-l 0 sets the recursion depth to unlimited, so wget follows links as many levels deep as the site goes.
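If you'd rather not go all the way down, -l takes a number instead; for example, this stops two levels deep:

wget -r -l 2 http://url.com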
-e robots=off is important because robots.txt often forbids access to exactly the files we want; those rules are intended for search engines, not for us.
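If you do ignore robots.txt, it's polite not to hammer the server; wget's --wait and --random-wait options space out requests. A sketch:

wget -r -e robots=off --wait=1 --random-wait http://url.com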
-A .zip tells wget to keep only .zip files. You can list more extensions (comma-separated), change zip to jpg, or remove the option altogether if you want everything.
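For example, to grab archives and images in one pass:

wget -r -A .zip,.jpg http://url.com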
--user and --password are hopefully obvious.
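One caveat: --password leaves your password in the shell history. wget's --ask-password flag prompts for it interactively instead:

wget --user=user --ask-password http://url.com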
wget -l 0 -e robots=off --keep-session-cookies -A .zip -H -D allowedomain.com,otherdomain.com -U "Mozilla/5.0 (X11; Linux i686; rv:12.0) Gecko/20100101 Firefox/12.0" -m --user=user --password=pass http://url.com
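Note that -m (mirror mode) is shorthand for -r -N -l inf --no-remove-listing, so it already turns on recursion, timestamping and infinite depth; the explicit -l 0 is just belt and braces.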