I suggest you use wget instead.
# download the whole directory
wget -rpk -l 10 -np -c --random-wait -U Mozilla http://ftp.debian.org/debian/doc

r - recursive download
p - gets the page requisites (images, CSS, etc.), not only the HTML files
k - converts the links so the downloaded pages can be browsed locally
l - recursion depth, i.e. how many subdirectory levels to follow
np - --no-parent, do not ascend into parent directories
c - continue in case of a network failure or other interruption. You can run the command again and it will not re-download files that are already complete
U - user agent
random-wait - wait a random amount of time before the next request
If login is needed, add --user=username --ask-password to the options. Do not use --password="..." or an address like ftp://user:password@host.com/dir, because the password will end up in your command history (see the combined example after this list)
You may add -R html,htm (a comma-separated reject list) to discard certain file types
nH - forces wget not to create a top-level host.com directory
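For example, a run that combines the login, reject, and nH options might look like the sketch below; the host, path, and username are placeholders, not part of the command above:

# hypothetical example: authenticated recursive download that rejects
# .html/.htm files and does not create a top-level host.com directory
wget -rpk -l 10 -np -c -nH --random-wait -U Mozilla \
     -R html,htm --user=username --ask-password ftp://host.com/dir/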