Hello people of the internet!
I am here to ask if there is a good way to concurrently download a list of files with async. Since this list can be quite long (~4000 files), I can't simply download all of them at once, but instead need to limit the number of concurrent downloads to some arbitrary number.
However, I cannot find any standard library function or tool that can help me do this. Ideally I would also be able to reuse AsyncHttpClients across downloads. In Python this can be done with asyncio.Semaphore, and I'm not sure if there's an equivalent in Nim.
In addition, I would like to add a nice progress bar at some point, perhaps with a download speed indicator, but that's less important than just getting it working.
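For the progress bar, I'm hoping I can just hook into the onProgressChanged callback that std/httpclient exposes on each client; a rough, untested sketch of what I have in mind:

import std/[asyncdispatch, httpclient]

var client = newAsyncHttpClient()
# Called periodically during a transfer; speed is in bytes per second.
client.onProgressChanged = proc (total, progress, speed: BiggestInt) {.async.} =
  echo "Downloaded ", progress, " of ", total, " bytes (", speed div 1000, " kB/s)"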
Cool, seems to work. This is the code that I've come up with.
proc downloadLevels(urls: seq[Uri], folder: string, threads: int = 8): Future[void] =
  ## Concurrently downloads multiple urls into given folder.
  ## `threads` is the maximum concurrency of the download.
  var mainFuture = newFuture[void]("downloadLevels")
  let clients = newSeqWith(threads, newAsyncHttpClient())
  var remainingUrls = urls
  var finishedClients = 0

  proc bump(clientIndex: int) =
    if remainingUrls.len == 0:
      inc finishedClients
      if finishedClients == threads:
        mainFuture.complete()
      return
    let currentUrl = remainingUrls.pop()
    let currentClient = clients[clientIndex]
    var fut = currentClient.downloadLevelSafe(currentUrl, folder)
    fut.addCallback do ():
      bump(clientIndex)

  for i in 0..<threads:
    bump(i)
  return mainFuture
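For completeness, downloadLevelSafe is just my own wrapper around AsyncHttpClient.downloadFile that swallows errors so one bad file doesn't abort the whole batch. A stripped-down sketch of it and of how I call everything (the URLs here are made up):

import std/[asyncdispatch, httpclient, os, sequtils, uri]

proc downloadLevelSafe(client: AsyncHttpClient, url: Uri,
                       folder: string): Future[void] {.async.} =
  ## Stripped-down version of my helper: download one url into folder,
  ## logging failures instead of raising.
  try:
    await client.downloadFile(url, folder / url.path.extractFilename)
  except CatchableError as e:
    echo "Failed to download ", url, ": ", e.msg

when isMainModule:
  let urls = @[
    "https://example.com/levels/level1.zip",
    "https://example.com/levels/level2.zip"
  ].mapIt(parseUri(it))
  waitFor downloadLevels(urls, "levels", threads = 2)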
First of all, threads != async: your code won't actually be executed in threads (mentioning this just in case, since you're using threads as the parameter name).
Secondly, you don't need to copy urls. Just keep a counter x = urls.len of remaining URLs, take let currentUrl = urls[x - 1], and decrement x until it reaches 0.
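Something like this, as an untested sketch of your bump logic with a counter (and with the parameter renamed to maxConcurrency so it doesn't suggest real threads):

proc downloadLevels(urls: seq[Uri], folder: string,
                    maxConcurrency: int = 8): Future[void] =
  ## Same structure as yours, but without copying urls.
  var mainFuture = newFuture[void]("downloadLevels")
  let clients = newSeqWith(maxConcurrency, newAsyncHttpClient())
  var remaining = urls.len        # number of urls not yet handed out
  var finishedClients = 0

  proc bump(clientIndex: int) =
    if remaining == 0:
      inc finishedClients
      if finishedClients == maxConcurrency:
        mainFuture.complete()
      return
    dec remaining
    let fut = clients[clientIndex].downloadLevelSafe(urls[remaining], folder)
    fut.addCallback do ():
      bump(clientIndex)

  for i in 0..<maxConcurrency:
    bump(i)
  return mainFuture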