How To Download From GitHub on Linux



This guide will teach you, step by step, how to download files from the command line in Linux, Windows, or macOS using wget, a free and open source tool. Wget is a very capable command-line downloader for Linux and UNIX environments that has also been ported to Windows and macOS. Don't be fooled by the fact that it is a command-line tool: it is powerful and versatile, and can match some of the best graphical downloaders around today. It supports resuming interrupted downloads, bandwidth limiting, authentication, and much more. I'll get you started with the basics of using wget, and then show you how to automate a complete backup of your website using wget and cron.

Let's get started by installing wget. Most Linux distributions come with wget pre-installed. If you manage to land yourself a Linux machine without a copy of wget, try the following. On a Red Hat based system such as Fedora you can use:

# yum install wget

Git, for its part, is by far the most popular version control system available to developers. Created in 2005 by Linus Torvalds, the creator of the Linux kernel, Git is built as a distributed system that lets multiple developers and teams work together on the same codebase. New GitHub accounts come with a prefab repo populated by a README file, a license, and buttons for quickly creating bug reports, pull requests, wikis, and other useful features. Free GitHub accounts only allow public repositories, which means anyone can see and download your files.
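Since downloading from GitHub is where this guide started, here is a minimal sketch of the two usual approaches on Linux: cloning with git, or grabbing a snapshot archive with wget. The user/repo name and the main branch below are placeholders, not a real repository -- substitute your own.

```shell
# user/repo is a placeholder -- substitute a real repository
REPO="user/repo"

# Full clone, including history:
git clone "https://github.com/${REPO}.git" || echo "clone failed (check the URL and your connection)"

# Or fetch a tarball snapshot of the default branch with wget:
# wget "https://github.com/${REPO}/archive/refs/heads/main.tar.gz"
```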

or if you use a Debian based system like Ubuntu:

# apt-get install wget

One of the above should do the trick for you. Otherwise, check with your Linux distribution's manual to see how to get and install packages. Users on Windows can access wget via this website, and for Mac users we have a full guide on how to install wget in macOS.
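Before moving on, you can confirm wget is actually on your PATH -- a quick sanity check that works on any of the systems above:

```shell
# Print the installed wget version, or a hint if it is missing
if command -v wget >/dev/null 2>&1; then
    wget --version | head -n 1
else
    echo "wget is not installed"
fi
```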

The most basic operation a download manager needs to perform is to download a file from a URL. Here's how you would use wget to download a file:

# wget https://www.simplehelp.net/images/file.zip
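Two options worth knowing from the start: -O names the output file, and -q suppresses wget's progress output. A quick sketch, reusing the same placeholder URL:

```shell
# -O names the output file; -q keeps wget quiet (the URL is a placeholder)
URL="https://www.simplehelp.net/images/file.zip"
wget -q -O file-renamed.zip "$URL" || echo "download failed (check the URL and your connection)"
```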

Yes, it's that simple. Now let's do something more fun. Let's download an entire website. Here's a taste of the power of wget. If you want to download a website, you can specify the depth that wget must fetch files from. Say you want to download the first-level links of Yahoo!'s home page. Here's how you would do that:

# wget -r -l 1 https://www.yahoo.com/

Here's what each option does. The -r flag activates recursive retrieval of files. The -l flag stands for level, and the number 1 next to it tells wget how many levels deep to go while fetching files. Try increasing the number of levels to two and see how much longer wget takes.

Now if you want to download all the 'jpeg' images from a website, a user familiar with the Linux command line might guess that a command like 'wget http://www.sevenacross.com*.jpeg' would work. Well, unfortunately, it won't. What you need to do is something like this:

# wget -r -l1 --no-parent -A .jpeg https://www.yahoo.com

Another very useful option in wget is the resumption of a download. Say you started downloading a large file and you lost your Internet connection before the download could complete. You can use the -c option to continue your download from where you left it.

# wget -c http://www.example_url.com/ubuntu-live.iso

Now let's move on to setting up a daily backup of a website. The following command will create a mirror of a site on your local disk. For this purpose wget has a dedicated option, --mirror. Try the following command, replacing sevenacross.com with your own website's address.

# wget --mirror http://www.sevenacross.com/

When the command is done running you should have a local mirror of your website. This makes for a pretty handy backup tool. Let's turn this command into a shell script and schedule it to run at midnight every night. Open your favorite text editor and type the following. Remember to adapt the backup path and the website URL to your requirements.

#!/bin/bash

YEAR=$(date +'%Y')
MONTH=$(date +'%m')
DAY=$(date +'%d')

BACKUP_PATH="/home/backup" # replace with your backup directory
WEBSITE_URL="http://www.sevenacross.net" # replace with the address of the website you want to back up

# Create and move to today's backup directory
mkdir -p "$BACKUP_PATH/$YEAR/$MONTH/$DAY"
cd "$BACKUP_PATH/$YEAR/$MONTH/$DAY" || exit 1

wget --mirror "$WEBSITE_URL"

Now save this file as something like website_backup.sh and grant it executable permissions:


# chmod +x website_backup.sh

Open your cron configuration with the crontab -e command and add the following line at the end:


0 0 * * * /path/to/website_backup.sh
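To read that crontab entry, the five fields before the command are, in order, minute, hour, day of month, month, and day of week. Annotated as a comment block:

```shell
# minute hour day-of-month month day-of-week  command
#   0     0        *         *        *       /path/to/website_backup.sh
#
# i.e. run at 00:00 (midnight) every day. To run at 3:30 AM instead:
# 30 3 * * * /path/to/website_backup.sh
```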


You should have a fresh copy of your website in /home/backup/YEAR/MONTH/DAY every day. For more help using cron and crontab, see this tutorial.

Now that you have the basics of downloading files from the command line, you can get into the advanced stuff by reading wget's man page – just type man wget from the command line.

If this article helped you, I'd be grateful if you could share it on your preferred social network - it helps me a lot. If you're feeling particularly generous, you could buy me a coffee and I'd be super grateful :)



