wget save as?
Posted by Fahd, 07-21-2007, 04:30 PM |
I need a shell command like wget that I can use to download files, but with a "save as" option.
For example, I use
wget -i downloadlist.txt
to download all my files.
But I need the option to save each of the downloaded files with a name of my choice. What is the best tool to do this?
I could do some PHP code if necessary, but I would prefer a shell command.
Thanks for any help!
|
Posted by FirmbIT, 07-21-2007, 04:37 PM |
wget www.domain.com/file.ext -O /path/to/file.name
|
Posted by Adam H, 07-21-2007, 04:44 PM |
Since your question is pretty much answered, I'll post the wget manual.
http://www.editcorp.com/Personal/Lar...et/wget_7.html
Very good bit of documentation.
|
Posted by Fahd, 07-21-2007, 05:00 PM |
Right, but I need to do it for a list of URLs in a file. Is there an option like
wget -i downloadlist.txt -savefileas filenames.txt
or similar?
|
Posted by 040Hosting, 07-21-2007, 05:19 PM |
Isn't -i -O working then?
Otherwise, create a small shell script that goes through your URL list and builds the wget command with the parameters filled in by your script.
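(Note, though: a single -O applies to the whole wget run, so combined with -i everything just gets appended into that one file. The output name below is made up; a line like this leaves you with one big concatenated file rather than separate downloads.)

wget -i downloadlist.txt -O everything.out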
Last edited by 040Hosting; 07-21-2007 at 05:20 PM.
Reason: added suggestion of shell scripting.
|
Posted by knightofshadows, 07-23-2007, 05:12 PM |
You will need a bash or Perl script to perform this. PM me for help.
|
Posted by bear, 07-23-2007, 09:41 PM |
Why can't you post your help here?
|
Posted by dristi, 07-24-2007, 12:13 AM |
# fetch every URL listed in file.txt, saving each under its original filename
for i in $(cat file.txt)
do
    wget "$i" -O "/path/to/$(basename "$i")"
done
where file.txt contains your input (the website URLs, one per line).
|
Posted by Fahd, 07-25-2007, 06:00 PM |
OK, allow me to explain myself a little better with an example...
List of URLs to be downloaded, in urls.txt:
www.example.com/something.zip
www.example.com/something-else.jpg
www.example.com/whatever.tar.gz
List of names to be "saved as", in names.txt:
Fred.zip
Webhosting.jpg
RHELFC5.tar.gz
I need a command or script that will do the following for the above example...
1. download www.example.com/something.zip and save it as Fred.zip since these are the corresponding first entries in both files.
2. download www.example.com/something-else.jpg and save it as Webhosting.jpg since these are the corresponding second entries in both files.
3. download www.example.com/whatever.tar.gz and save it as RHELFC5.tar.gz since these are the corresponding third entries in both files.
I am not familiar with shell scripting at all. I could do it in PHP, I suppose, but there are a few thousand entries and it eats up PHP's memory limit.
Thanks for any help!
|
Posted by Tealeaf, 07-25-2007, 10:31 PM |
How about hiring someone to do the coding for you?
|
Posted by Fahd, 07-25-2007, 10:39 PM |
I'm open to that. I just thought someone might know this offhand as I didn't think it was that much work.
|
Posted by MaximSupport, 07-25-2007, 11:29 PM |
Dear Fahd,
For this purpose you can write a shell script.
Best Regards.
|
Posted by anatolijd, 07-26-2007, 10:49 AM |
Yo, dude!
You may be impressed, but for the task given above you do not even need a script - it can all be done with a single shell line, something like the sketch below. I checked the resulting output to confirm it works correctly.
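Assuming urls.txt and names.txt as posted above (one entry per line), roughly:

# merge the two lists into url:name pairs, build one wget command per line, and run them
paste -d: urls.txt names.txt | awk -F: '{print "wget "$1" -O "$2}' | sh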
|
Posted by anatolijd, 07-26-2007, 10:55 AM |
The only caveat is that neither the URLs nor the names may contain the colon character ":". Usually they don't, but it sometimes happens...
|
Posted by 040Hosting, 07-26-2007, 10:56 AM |
Keeping code readable is also nice.
Besides, your code is a bit more expensive - awk in particular is a resource eater. But a nice one-liner (well, almost: your ';' separators are really extra lines).
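For instance, the same job spelled out as a loop (assuming the two files line up one-to-one and contain no spaces) reads a lot easier:

# read one URL and one name per iteration and fetch
while read -r url <&3 && read -r name <&4
do
    wget "$url" -O "$name"
done 3< urls.txt 4< names.txt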
|
Posted by anatolijd, 07-26-2007, 11:16 AM |
I don't care about "awk eating resources" after I used this trick to mysqldump about 1400 MySQL databases on my shared hosting server.
|
Posted by Fahd, 07-28-2007, 06:38 AM |
Did not work for me. I got errors like...
//www.example.com/something.zip: No such file or directory
//www.example.com/something-else.jpg: No such file or directory
//www.example.com/whatever.tar.gz: No such file or directory
Looks like it stripped the "http:" part from my list of URLs.
Thanks for trying though!
|
Posted by 040Hosting, 07-28-2007, 08:34 AM |
Try this: as $1 would be "http" instead of the URL, it should be variables $2 and $3 instead.
urls.txt should contain the URLs to the files, and names.txt the names these should be saved as, like the example below.
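Presumably with the literal "http:" put back in the print - a rough sketch of that variant, plus what the two input files would look like with the full http:// prefixes:

paste -d: urls.txt names.txt | awk -F: '{print "wget http:"$2" -O "$3}' | sh

urls.txt:
http://www.example.com/something.zip
http://www.example.com/something-else.jpg
http://www.example.com/whatever.tar.gz

names.txt:
Fred.zip
Webhosting.jpg
RHELFC5.tar.gz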
Good luck
Last edited by 040Hosting; 07-28-2007 at 08:38 AM.
Reason: added example file syntax
|
Posted by anatolijd, 07-28-2007, 08:58 AM |
Fahd, it is easy to fix - even easier than rainboy's suggestion, as you do not need to modify your urls.txt. Find the difference between the original one-liner and the corrected one in the sketch below.
In short: we merge the two files into a single file with two columns, then read it line by line (URL and filename), pass each line through the awk utility to form a valid wget command like "wget first.field -O second.field", and finally pass that command to the shell interpreter.
In awk -F: '{print "wget "$2" -O "$3}' the double quotes delimit literal string values, so we can write absolutely anything in there - including the "http:" that the colon split chops off. Hope you got the idea...
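A sketch of the two versions being compared (same urls.txt and names.txt as before; the exact quoting may have differed):

# original: with "http://..." URLs, $1 is just "http" and $2 is "//host/path",
# so wget gets a bogus host and -O tries to write to "//..." (the errors above)
paste -d: urls.txt names.txt | awk -F: '{print "wget "$1" -O "$2}' | sh

# corrected: use $2 and $3 and put the literal "http:" back in front of the URL
paste -d: urls.txt names.txt | awk -F: '{print "wget http:"$2" -O "$3}' | sh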
Last edited by anatolijd; 07-28-2007 at 09:10 AM.
|
Posted by 040Hosting, 07-28-2007, 09:16 AM |
Sorry, but looking at his error it seems he already has http:// in his urls.txt (even though in an earlier post he showed URLs without it). In that case my solution is more straightforward.
If you run the script against URLs with http:// you get exactly the errors he showed; your example did work fine with a urls.txt without http://.
Last edited by 040Hosting; 07-28-2007 at 09:16 AM.
Reason: you=we
|
Posted by trancephorm, 08-31-2008, 11:27 AM |
I tried this every possible way and it just doesn't work for me... the errors vary ("unsupported scheme", etc.).
Here's what I have:
- url.txt with a complete URL on every line, and names.txt with the names corresponding to the URLs.
I tried putting the data in one file too, separated by spaces...
Well, I'm definitely a shell beginner, but not too inexperienced with it; maybe my mistake is something stupid, but I wouldn't say so... Please help, it should be simple...
|
Posted by brianoz, 09-04-2008, 06:58 AM |
Ok ... I AM impressed, very nice bit of code!
Some slight improvements/simplifications (now you've given me the idea, of course!):
This first version gets rid of the Awk. It's still broken though as it will still choke on the http:// -
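Perhaps something like this - the shell does the ":" splitting itself, which is exactly what trips over the scheme:

paste -d: urls.txt names.txt | while IFS=: read -r url name
do
    wget "$url" -O "$name"
done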
This second version should actually work - it splits on space (tab) so you need to make sure none of your URLs or filenames have space in them:
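Something along these lines (paste's default delimiter is a tab, and read splits on any whitespace):

paste urls.txt names.txt | while read -r url name
do
    wget "$url" -O "$name"
done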
As a side note, it's probably smarter and easier to combine the two files into one. I'd have the URL and the filename on one line, separated by one or more tabs.
If you decide to do that, this version will work:
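Assuming the combined file is called list.txt (the name is made up here), something like:

# list.txt: each line is "URL<TAB>filename", separated by one or more tabs
while read -r url name
do
    wget "$url" -O "$name"
done < list.txt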
I think shell programming's a bit of a dying art - I used to teach it once!
Last edited by brianoz; 09-04-2008 at 07:02 AM.
|