Most of the time, when you want to download a file, you right-click its URL in your browser, choose "Save Target As", and save it somewhere on your local hard drive. But imagine you want to download about 100 files; saving each one individually would be really tedious and time consuming. So let me first introduce you to a tool called "wget".
"GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive commandline tool, so it may easily be called from scripts", GNU Wget Page.
The good news is that Windows binaries are available, so you can run it on your Windows PC. One more thing to mention is that you can write a simple "for" loop in DOS:
FOR %variable IN (set) DO command [command-parameters]
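For example, the /L form used in the commands below steps through a numeric range given as (start,step,end); typed directly at the command prompt, this prints the numbers 1 through 5:
for /L %i in (1,1,5) do echo %i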
Let's say we want to download the Holy Quran from the Islam-Way website. Each Sura is stored in a separate file, and these files are named "001.mp3", "002.mp3", and so on. All the files are stored under the following location: "http://download.quran.islamway.com/quran3/133".
So, open the Start Menu, click "Run", type "cmd.exe", and hit "OK". Repeat this three times so you have three command prompt windows open, then run one of the following commands in each window to download all the files. Now make yourself a cup of coffee, grab a book, and read until your computer finishes downloading.
for /L %i in (1,1,9) do wget http://download.quran.islamway.com/quran3/133/00%i.mp3 -O 00%i.mp3
for /L %i in (10,1,99) do wget http://download.quran.islamway.com/quran3/133/0%i.mp3 -O 0%i.mp3
for /L %i in (100,1,114) do wget http://download.quran.islamway.com/quran3/133/%i.mp3 -O %i.mp3
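The three separate loops simply match the zero-padded filenames (001-009, 010-099, 100-114). If you would rather keep the commands in a .bat file instead of typing them, note that inside a batch file the loop variable needs a doubled percent sign; a rough sketch of such a file:
@echo off
for /L %%i in (1,1,9) do wget http://download.quran.islamway.com/quran3/133/00%%i.mp3 -O 00%%i.mp3
for /L %%i in (10,1,99) do wget http://download.quran.islamway.com/quran3/133/0%%i.mp3 -O 0%%i.mp3
for /L %%i in (100,1,114) do wget http://download.quran.islamway.com/quran3/133/%%i.mp3 -O %%i.mp3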
Tags: Geek, Windows, Gr33n Data
Peace be upon you.
If you don't mind, I have two questions. The first: how do we choose which partition the files get saved to? Because, done this way, they will obviously land on the C: drive. The second: isn't it better to use a download manager such as Download Accelerator and the like, so the download can be resumed if it is interrupted? That is what sets those programs apart from right-clicking and choosing "Save As".
May God reward you.
Mo'men
Look, Mr. Mo'men:
Let's suppose, for example, that you want to save the files in the following location:
c:\folder1\folder2
Then, when you open the command prompt, type the following command before the command mentioned above:
cd c:\folder1\folder2
As for Download Accelerator and the like, it certainly has its advantages, but it forces you to save each file individually, which is a tedious process.
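A small sketch along the same lines: wget can also drop the files straight into a target folder through its -P (directory prefix) option, so the cd step becomes optional; using the same example path for the first batch of files:
for /L %i in (1,1,9) do wget -P c:\folder1\folder2 http://download.quran.islamway.com/quran3/133/00%i.mp3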
There is an option in Download Accelerator to download all the page elements. You choose it, get a list of all the page elements, including the 100 files, and then untick what you don't want.
I know your way is better, of course; I just wanted to show another way of doing things.
Thanks, Gohary.
In fact I don't use DAP, and maybe that's why I am not that familiar with its options.
Peace be upon you.
Many thanks, brother Tarek, for your answer.
Indeed, Download Accelerator has this feature and I always use it, but that doesn't stop me from wanting to know other methods for when the program isn't available.
Many thanks.
Mo'men
You're most welcome.
And I hope to see you around here a lot.
Instead of using an MS-DOS batch, you can simply use a Win32 port of a Unix shell, along with all the standard Unix utilities that come with it (e.g. awk, wget, etc.).
http://unxutils.sourceforge.net/
http://sourceforge.net/project/showfiles.php?group_id=9328
Concrete example: you've got a standard HTML page containing some links to MP3s.
wget -q http://www.dragonforce.com/mp3s.php -O - | gawk '/\.mp3/{print $0}' | wget -F -i -
"-i -" reads the standard input and "-O -" writes to the standard output.
First wget downloads the html file containing the links and only (-q) displays the html code to standart output (-O -).
Then this code is filtered with awk through the pipes to only keep the html links for the mp3s (/\.mp3/).
Finaly we use wget again from standart input (-i -) to retreive the mp3s listed in input as html links (-F).
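One assumption hiding in this pipeline is that the page's MP3 links are absolute; if they are relative, wget's -B (base URL) option can be added to the last step so they resolve against the site, roughly:
wget -q http://www.dragonforce.com/mp3s.php -O - | gawk '/\.mp3/{print $0}' | wget -F -B http://www.dragonforce.com/ -i -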
Windows sucks a bit less using win32 unix utils.
msc.
Thanks a lot, msc.
This is really cool.