How To Use the wget Command in Linux

    Downloading files is one of the most common tasks on today's internet, and while numerous tools are available for the job, Linux ships with one of the best. The wget utility is a simple, powerful, and efficient command-line tool for downloading files directly from URLs.
    The wget command offers features such as resuming interrupted downloads, speed and bandwidth limiting, downloads over encrypted connections, and fetching several files in one run. It can also interact with REST APIs. In this brief tutorial, we cover the most common ways to use the wget command in Linux.
    How To Use the wget Command in Linux

    Whether you need a single file or an entire set of files, the wget utility can handle both, and it offers several options to tweak its behavior. In its simplest form, wget downloads a file from a URL. For example, to download jquery-3.7.1.js from its official website, run:
    wget https://code.jquery.com/jquery-3.7.1.js




    The wget command, by default, saves downloaded files in the current directory under their original remote filenames. However, you can save a file at a specific location or under a particular name with the ‘-O’ option. For instance, the following command saves the above file as JavaScript.js:
    wget -O JavaScript.js https://code.jquery.com/jquery-3.7.1.js




    Similarly, to save the file to another location without changing the current directory, pass the target path along with the desired filename to ‘-O’:
    wget -O ~/Downloads/JavaScript.js https://code.jquery.com/jquery-3.7.1.js
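
    If you only want to change the destination directory and keep the original filename, the ‘-P’ (‘--directory-prefix’) option is a handy alternative; this sketch assumes the same jQuery URL and the ~/Downloads directory used above:
    # saves the file as ~/Downloads/jquery-3.7.1.js, keeping its original name
    wget -P ~/Downloads https://code.jquery.com/jquery-3.7.1.js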




    If a download fails or is interrupted, you can resume it from where it left off using the ‘--continue’ (or ‘-c’) option:
    wget -c https://code.jquery.com/jquery-3.7.1.js
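
    On an unreliable connection, it can help to combine resuming with wget’s retry options; the following sketch, using the same URL, is one way to do that with ‘--tries’ and ‘--timeout’:
    # retry up to 10 times, give up on a stalled attempt after 30 seconds, resume partial data
    wget -c --tries=10 --timeout=30 https://code.jquery.com/jquery-3.7.1.js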




    While downloading a file, if you are also performing other online tasks that need bandwidth, use the ‘--limit-rate’ option to cap the download speed:
    wget --limit-rate=50k https://code.jquery.com/jquery-3.7.1.js




    Here, ‘50k’ limits the speed of the specified download to 50 KB/s; you can replace it with any limit you like. This is helpful when you do not want the wget command to consume all available bandwidth.
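    The rate accepts the usual size suffixes, so a one-megabyte-per-second cap, for example, would look like this:
    wget --limit-rate=1m https://code.jquery.com/jquery-3.7.1.js
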
    The most powerful feature of the wget utility is its ability to download entire websites recursively. The ‘-r’ or ‘--recursive’ option downloads HTML pages along with the linked files, CSS, and images they reference. For example, to mirror the jQuery CDN site rather than a single file:
    wget -r https://code.jquery.com/
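
    Recursive downloads can pull in a lot of data, so in practice you will usually want to bound them. The sketch below, assuming you only need a couple of levels of the site for offline viewing, combines ‘-r’ with wget’s depth, parent, and link-conversion options:
    # limit recursion to 2 levels, never ascend above the starting directory, rewrite links for offline browsing
    wget -r -l 2 --no-parent --convert-links https://code.jquery.com/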




    Conclusion

    The wget command is a powerful and versatile tool for downloading files from URLs. This brief tutorial explained how to use the wget command and its most common options. Its most prominent feature is recursive website download, but it also lets you rename downloaded files and resume interrupted downloads. Moreover, if your bandwidth is limited, the ‘--limit-rate’ option caps the download speed.



