How to Get and Download all File Type Links from a Web Page - Linux
This tutorial explains how to take a URL, extract every link to a specific file type (pdf, jpg, mp3, wav, or any other extension) into a list, and download all of those links in Linux. In my example, I have a web page with over 20 links to PDF files. Instead of downloading them individually by hand, this script downloads all of them at once and also gives me a list of each link.
You need to have lynx and wget installed before running this script. To install them, run the command for your distribution:
Ubuntu: sudo apt-get install lynx-cur wget
openSUSE: sudo zypper install lynx wget
Save the following text as link-dl.sh and execute it by running "sh link-dl.sh":
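A minimal sketch of such a script is below. It is one possible implementation, not necessarily the author's original: it assumes you pass the page URL and the extension as arguments (both are placeholders you supply), uses "lynx -dump -listonly" to list every link on the page, filters the list with grep, saves the matches to links.txt, and feeds that file to "wget -i" to download them all.

```shell
#!/bin/sh
# link-dl.sh -- list and download all links with a given extension from a page.
# Usage: sh link-dl.sh <url> <extension>
# Example: sh link-dl.sh http://example.com/page.html pdf
# (The URL and extension above are placeholders, not from the original article.)

extract_links() {
    # Read "lynx -dump -listonly" output on stdin and keep only the URLs
    # that end in the requested extension ($1), one per line, de-duplicated.
    grep -Eio 'https?://[^ ]+\.'"$1"'$' | sort -u
}

if [ $# -eq 2 ]; then
    url="$1"
    ext="$2"
    # Dump every link on the page and keep the ones matching the extension.
    lynx -dump -listonly "$url" | extract_links "$ext" > links.txt
    cat links.txt          # show the list of matching links
    wget -i links.txt      # download every link in the list
fi
```

The link list is written to links.txt in the current directory, so you keep a record of each URL alongside the downloaded files.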