When downloading big files it is often necessary to pause the download many times. A big file can be downloaded using wget: first start the download, then kill wget, then resume it with wget -c.

I want to pause downloads many times depending on events such as power failure, network unavailability, etc. The pausing should not close the connection to the server: either the download should wait for a resume command, or its bandwidth should be throttled down to very low data usage. I know a process can be paused by kill -STOP "$pid".

"How to pause aria2 download" is a partial solution, but the pause option is not supported in my aria2 (1.8.0, Ubuntu 10.04 LTS) and I can't upgrade aria2 (either via apt-get install or by installing manually). The RPC option for aria2 seems to work from Ubuntu 11.10, so a portable version of aria2 is also worth considering.

I'm looking for a download manager that can throttle bandwidth and pause/resume all (or one of) its running downloads upon execution of a switch such as --throttle=10K or --pause, and resume them on a --resume switch (or by other means, such as RPC).

I know killall -9 $(pidof downloader) should stop all downloads, which can then be resumed from the last point. But this is a bad choice for me, as the number of pauses/resumes is too high, and the time spent reconnecting to the server where the file is located is a waste of time and bandwidth. aria2 can do part of the job with its --stop switch, but that is obsolete and solves the problem only partially.

FlareGet is free, but you have to buy the browser integration. Its features:

Dynamic File Segmentation: a robust dynamic file segmentation algorithm speeds up the download; up to 32 segments per download are supported.
HTTP Pipelining: in addition to dynamic file segmentation, each segment is further accelerated up to six times.
Auto Segmentation: when one segment ends, it starts on another segment to help that segment finish faster.
Enhanced Browser Integration: the only download manager for Linux that integrates with all browsers to capture the download URL and start the download by itself.
Multi-protocol support: HTTP, HTTPS and FTP, letting you download files from the internet.
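The kill-then-`wget -c` workflow can be simulated entirely locally, which makes the mechanism clear: resuming is just fetching the byte range you don't have yet. This is a sketch with made-up local files (`remote.bin` standing in for the server-side copy, `partial.bin` for an interrupted download), not a real network transfer:

```shell
# Local simulation of what `wget -c` does: resume by fetching only the
# missing byte range. remote.bin stands in for the file on the server.
printf '0123456789' > remote.bin
head -c 4 remote.bin > partial.bin           # the download died after 4 bytes
offset=$(wc -c < partial.bin)                # bytes we already have locally
# wget -c would send "Range: bytes=4-"; locally, copy from that offset on.
tail -c +"$((offset + 1))" remote.bin >> partial.bin
cmp -s remote.bin partial.bin && echo "resume complete"
```

Against a real server this only works when the server honors Range requests; otherwise wget has to restart from byte zero, which is exactly the reconnection cost complained about above.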
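The kill -STOP idea mentioned above can be demonstrated with any process. SIGSTOP freezes the process while keeping its file descriptors (including TCP sockets) open, and SIGCONT resumes it exactly where it stopped; a plain `sleep` stands in for the downloader in this sketch:

```shell
# Pause/resume a process without killing it. `sleep` stands in for the
# downloader; its open descriptors survive the stop/continue cycle.
# Caveat: during a long pause the server may still drop the idle connection.
sleep 60 &
pid=$!
kill -STOP "$pid"
sleep 1                                        # let the state change settle
paused=$(ps -o stat= -p "$pid" | tr -d ' ')    # 'T' = stopped
kill -CONT "$pid"
sleep 1
running=$(ps -o stat= -p "$pid" | tr -d ' ')   # back to 'S' (sleeping)
kill "$pid" 2>/dev/null
echo "paused=$paused running=$running"
```

This pauses the download without closing the connection, but it cannot throttle bandwidth, and the server-side idle timeout limits how long a pause can last.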
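For the RPC route: a newer aria2 than the 1.8.0 discussed above exposes a JSON-RPC interface (default endpoint `http://localhost:6800/jsonrpc` when started with `aria2c --enable-rpc`) with methods such as `aria2.pauseAll`, `aria2.unpauseAll`, and `aria2.changeGlobalOption`. A minimal sketch, assuming such a daemon is running (the curl calls are left commented out for that reason; the `"q"` request id is arbitrary):

```shell
# Driving a newer aria2 over JSON-RPC. Requires: aria2c --enable-rpc
RPC_URL="http://localhost:6800/jsonrpc"      # aria2's default RPC endpoint
pause_all='{"jsonrpc":"2.0","id":"q","method":"aria2.pauseAll","params":[]}'
resume_all='{"jsonrpc":"2.0","id":"q","method":"aria2.unpauseAll","params":[]}'
# Throttling instead of pausing: cap the global download rate at 10 KiB/s.
throttle='{"jsonrpc":"2.0","id":"q","method":"aria2.changeGlobalOption","params":[{"max-overall-download-limit":"10K"}]}'
# curl -s "$RPC_URL" -d "$pause_all"         # pause every active download
# curl -s "$RPC_URL" -d "$throttle"          # or throttle without pausing
# curl -s "$RPC_URL" -d "$resume_all"        # resume everything
echo "payloads prepared"
```

Throttling via `max-overall-download-limit` keeps the connections alive at a trickle, which matches the "very low data usage" alternative to a hard pause.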