C car Registered User Messages 1,397 20 Oct 2008 #1 I'm testing something and can't think/remember how to copy a file from a URL at the DOS prompt. Pseudo... copy "http://www.mysite.com/file.pdf" c:\my_download_dir Anyone know how, or got a utility? I googled for a few minutes but to no avail. It's not an FTP site.
M mickoneill30 Registered User Messages 48 20 Oct 2008 #2 How about this? http://www.ericphelps.com/webget/index.htm
C car Registered User Messages 1,397 20 Oct 2008 #3 Fair play, Mick, that slurp utility does what I want. Once I extract the zip, I just use slurp instead of copy and pass in the arguments.
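For later readers: if the slurp utility isn't to hand, curl does the same job from the command line (it ships with Windows 10 1803+ and most Unix systems; it wasn't standard on Windows in 2008). A minimal sketch, using the hypothetical URL and directory from the question:

```shell
# Fetch the file over HTTP and save it to a local path.
# -f  fail on HTTP errors instead of saving the error page
# -s  silent (no progress bar)
# -L  follow redirects
# -o  output file path
# URL and target directory are the hypothetical ones from the question.
curl -fsSL -o "c:\my_download_dir\file.pdf" "http://www.mysite.com/file.pdf"
```

The same invocation works with any URL scheme curl supports, so it drops into a batch file exactly where the original `copy` pseudocode sat.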