Hello. What is a way in Nim to fetch several HTTP responses asynchronously? How do I send several requests at once and then collect the responses asynchronously?
Thanks
Try this:

import asyncdispatch, httpclient, json

proc main() {.async.} =
  var client = newAsyncHttpClient()
  var resp = await client.request("http://www.yahoo.com/")
  echo resp.status
  #echo repr resp.headers
  #echo resp.body

  client.headers = newHttpHeaders({
    "Content-Type": "application/json"
  })
  var body = %*{
    "username": "tom"
  }
  resp = await client.request(url = "http://www.yahoo.com/",
                              httpMethod = HttpPost,
                              body = $body)
  echo resp.status
  client.close()

waitFor main()
You described synchronous requests. I mean something like this in JS:

var p1 = new Promise(...),
    p2 = new Promise(...);
var p3 = Promise.all([p1, p2]);
p3.then(...).catch(...);

The requests must run asynchronously. What is the way to do that in Nim?
I think what is missing here is a way to collect the futures, when they are all done, into a single future. It could be something like this (I did not try to compile it):

proc all[A](fs: seq[Future[A]]): Future[seq[A]] =
  result = newFuture[seq[A]](fromProc = "all")
  var
    items = newSeq[A](fs.len)
    count = 0
  for i, f in fs:
    f.callback = proc (g: Future[A]) =
      items[i] = g.read
      count += 1
      if count == fs.len:
        result.complete(items)
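For reference, recent versions of asyncdispatch ship a generic all proc that does this collection out of the box, so a hand-rolled version may be unnecessary. A minimal sketch, using sleepAsync in place of real asynchronous work (delayed is a made-up helper, not a library proc):

```nim
import asyncdispatch

proc delayed(x: int): Future[int] {.async.} =
  # Stand-in for some asynchronous operation that
  # eventually produces a value.
  await sleepAsync(10)
  return x

# asyncdispatch's all() combines the futures into a single
# Future[seq[int]], much like Promise.all in JavaScript.
let results = waitFor all(@[delayed(1), delayed(2), delayed(3)])
echo results
```

The combined future completes once every input future has completed, with the results in the same order as the inputs.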
Is there a simpler, more readable way to send multiple HTTP requests asynchronously? It's a common task, and the solution should be trivial. Unfortunately, my naive attempts, like the one below, have been unsuccessful so far.
# INCORRECT CODE, DO NOT USE
import asyncdispatch, httpclient

proc doRequest(url: string): Future[Response] {.async.} =
  let
    client = newAsyncHttpClient()
    response = await client.get(url)
  return response

let
  response1 = waitFor doRequest("http://localhost:8080") # On localhost I have a dummy server that sleeps for 1 sec before responding.
  response2 = waitFor doRequest("http://localhost:8080") # This blocks for 2 seconds total.
I do understand that waitFor blocks, but what is the right way to do it instead?
Replacing waitFor with asyncCheck does not trigger the requests.
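The key is to start the futures before waiting on any of them, so the requests are in flight at the same time. A sketch of the pattern, with sleepAsync standing in for the dummy server's 1-second delay (fakeRequest is a placeholder, not a real HTTP call):

```nim
import asyncdispatch, times

proc fakeRequest(id: int): Future[int] {.async.} =
  # Stand-in for an HTTP request against a server that
  # sleeps for ~1 second before responding.
  await sleepAsync(1000)
  return id

let start = epochTime()

# Start both "requests" first; neither call blocks here.
let
  fut1 = fakeRequest(1)
  fut2 = fakeRequest(2)

# `and` yields a future that completes when both inputs do,
# so the two delays overlap and this waits ~1 s, not ~2 s.
waitFor(fut1 and fut2)

let elapsed = epochTime() - start
echo fut1.read(), " ", fut2.read(), " in ", elapsed, " s"
```

With real requests, doRequest would simply replace fakeRequest; the two waitFor calls in the incorrect version above are what serialize the delays.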
Yeah, I saw it, and I don't mean to offend, but this looks way too complicated for such a simple task :-)
Also, your approach is based on callbacks, which I feel is considered bad style compared to coroutine style.
I am sure there must be a better way :-)
My approach is based on callbacks because that is pretty much the only way to attach behaviour to futures :-)
In any case, the all function only needs to be written once. Then, you can do

let
  req1 = doRequest("http://localhost:8080")
  req2 = doRequest("http://localhost:8080")
  bothResponses = waitFor all(@[req1, req2])

Admittedly I did not try it, but it should work. The all function collects several futures into one; then you can wait on it, or whatever.
I did that:

import asyncdispatch, httpclient, strutils

proc asyncGet(s: string): Future[Response] {.async.} =
  var client = newAsyncHttpClient()
  var st = s.strip()
  if not s.startsWith("http"):
    st = "http://" & s
  return await client.get(st)

proc all(s: seq[Future[Response]]) {.async.} =
  var retFuture = newFuture[void]("all")
  retFuture.complete()
  for i in 0 .. len(s) - 1:
    retFuture = retFuture and s[i]
  await retFuture

proc main() {.async.} =
  let paths = @["127.0.0.1:8080/hello",
                "127.0.0.1:8080/file",
                "127.0.0.1:8080/form/321"]
  var rest = newSeq[Future[Response]](paths.len)
  for i in 0 .. len(paths) - 1:
    rest[i] = asyncGet(paths[i])
  await all(rest)
  # for i in rest:
  #   echo i.read().body

waitFor main()
Another version:

import asyncdispatch, httpclient, strutils

proc asyncGet(s: string): Future[Response] {.async.} =
  var client = newAsyncHttpClient()
  var st = s.strip()
  if not s.startsWith("http"):
    st = "http://" & s
  return await client.get(st)

proc all(s: seq[Future[Response]]) {.async.} =
  var retFuture = newFuture[void]("all")
  var counter = len s
  for i in 0 .. len(s) - 1:
    s[i].callback =
      proc () =
        counter.dec
        if counter == 0:
          retFuture.complete()
  await retFuture

proc main() {.async.} =
  let paths = @["127.0.0.1:8080/hello",
                "http://127.0.0.1:8080/file",
                "127.0.0.1:8080/form/321"]
  const cnt = 100
  var rest = newSeq[Future[Response]](cnt)
  for i in 0 .. cnt - 1:
    rest[i] = asyncGet(paths[0])
  await all(rest)
  # for i in rest:
  #   echo i.read().body

waitFor main()
There is some trouble with that implementation. When cnt is equal to 100, everything is OK and the code executes fast (~20 ms).
But when cnt is more than 100 (200 (~1 s), 300 (~1.5 s) and so on), the code executes very slowly (50-100 times slower).
Any ideas, please? P.S. I checked the server side; server load doesn't rise above 7%.
@karatin - remove the await calls in asyncGet and all procs.
The only await calls should be done in main().
It would look like this:
import asyncdispatch, httpclient, strutils

proc asyncGet(s: string): Future[Response] =
  var client = newAsyncHttpClient()
  var st = s.strip()
  if not s.startsWith("http"):
    st = "http://" & s
  result = client.get(st)

proc all(s: seq[Future[Response]]): Future[void] =
  var retFuture = newFuture[void]("all")
  var counter = len s
  for i in 0 .. len(s) - 1:
    s[i].callback =
      proc () =
        counter.dec
        if counter == 0:
          retFuture.complete()
  result = retFuture

proc main() {.async.} =
  let paths = @["http://google.com",
                "http://127.0.0.1:8080/file",
                "127.0.0.1:8080/form/321"]
  const cnt = 100
  var rest = newSeq[Future[Response]](cnt)
  for i in 0 .. cnt - 1:
    rest[i] = asyncGet(paths[0])
  await all(rest)
  for i in rest:
    echo i.read().body

waitFor main()
With this thread being 5 years old now, have there been any updates in async requests?
I'm trying to replicate the following Python code, which makes requests asynchronously and returns a list of all the responses' content:
async def async_get(url, session):
    async with session.get(url) as response:
        return await response.text()

async def all(urls):
    session = get_session()
    ret = await asyncio.gather(*[async_get(url, session) for url in urls])
    await session.close()
    return ret
Some of the old solutions here look quite nasty, since they lack something like asyncio's gather function.
Presumably the individual request function would look something like this:

proc asyncGet(url: string): Future[string] {.async.} =
  var client = newAsyncHttpClient()
  result = await client.getContent(url)
So this works:

import asyncdispatch
import httpclient

proc asyncGet(url: string): Future[string] {.async.} =
  var client = newAsyncHttpClient()
  result = await client.getContent(url)

proc main() {.async.} =
  let urls = @["https://www.google.com/"]
  var futures = newSeq[Future[string]](len(urls))
  for i, url in urls.pairs():
    futures[i] = asyncGet(url)
  var contents = await all(futures)

waitFor main()
But I would like to move it into a function that takes a seq of URLs; I'm just not sure what the correct return type would be.
Yeah I just figured it out before you posted, typical.
import asyncdispatch
import httpclient

proc asyncGet(url: string): Future[string] {.async.} =
  var client = newAsyncHttpClient()
  result = await client.getContent(url)

proc asyncGetAll(urls: seq[string]): Future[seq[string]] {.async.} =
  var futures = newSeq[Future[string]](len(urls))
  for i, url in urls.pairs():
    futures[i] = asyncGet(url)
  return await all(futures)

proc main() {.async.} =
  let urls = @["https://www.google.com/"]
  var contents = await asyncGetAll(urls)

waitFor main()