Only the first step is missing.
I am trying it with httpclient, using SSL (that's enabled in the config; it works at run time).
code:

import strformat
import httpclient
import os
import macros

proc staticDownload*(file: string, link: string, overwrite: bool = false) =
  if os.fileExists(file) and not overwrite:
    error(fmt"Not downloading '{link}'; file '{file}' already exists; leaving it as is. Use --force to overwrite.")
    return
  var client = newHttpClient() # Real problem is either here ...
  try:
    var file_h = open(file, fmWrite)
    defer: file_h.close()
    file_h.write(client.getContent(link)) # ... or here
    echo(fmt"Success - downloaded '{link}' to '{file}'.")
  except IOError as err:
    error(fmt"Failed to download '{link}' to '{file}': " & err.msg)

macro dl(url: static[string]): untyped =
  let tempExtsFile = os.joinPath(os.getTempDir(), "some_file_" & "123" & ".txt") # TODO HACK Need to randomize somehow, but can't use rand() at compile time; maybe fetch the time?
  staticDownload(tempExtsFile, url)

dl("https://raw.githubusercontent.com/hoijui/file-extension-list/master/data/code")
Error:
/home/user/Projects/tool/src/static_dl.nim(23, 3) template/generic instantiation of `dl` from here
/home/user/.choosenim/toolchains/nim-#devel/lib/pure/httpclient.nim(335, 12) Error: cannot evaluate at compile time: defaultSslContext
Can it be done, or do I have to run a script downloading the file before running nimble?
As far as I know there is no way (aside from having a compiler built with libffi) to do web requests in the VM, so you'll need to use a system-provided program like curl or wget on Linux (no clue for Windows). As such, the following works on most Linux systems, assuming they have curl:
import std/[sequtils, strutils]

const
  Url = "https://raw.githubusercontent.com/hoijui/file-extension-list/master/data/code"
  data = static:
    let res = staticExec("curl -s '$#'" % Url)
    toSeq(res.splitLines)

echo data
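One caveat with `staticExec` is that it only returns the command's output, so a failed download silently produces an empty or partial constant. If you want the build to fail instead, `gorgeEx` (a `system` builtin) returns both the output and the exit code. A minimal sketch of that variant, using the same URL as above (the `-f` flag is an assumption on my part; it makes curl exit non-zero on HTTP errors):

```nim
import std/strutils

const
  Url = "https://raw.githubusercontent.com/hoijui/file-extension-list/master/data/code"
  # gorgeEx runs the command at compile time and, unlike staticExec,
  # also reports the exit code.
  fetched = gorgeEx("curl -sf '" & Url & "'")

when fetched.exitCode != 0:
  {.error: "compile-time download failed: " & Url.}

const extensions = fetched.output.splitLines()

echo "embedded ", extensions.len, " lines"
```

This way a network hiccup aborts compilation with a clear message instead of baking broken data into the binary.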
ahhh thank you! After reading your post I thought... hmm... that slowly starts to look kind of hackish (as my original idea already was).

I will use an alternative way of reaching my goal: I will maintain a separate git repo containing the files I need at compile time in this project. That repo could update itself in a daily scheduled CI run, where a script downloads all the files from their original sources and, if any of them changed, creates a commit. This repo is then used as a submodule in this Nim project repo, so I can use all these files at compile time, having them around as local files. This approach also improves the reproducibility of builds and... I guess it is just generally cleaner (even though it is more work to set up and keep running).

Alternatively, I could just include these files directly in the Nim project repo and update them manually every once in a while, with a download script included in the repo. That also improves reproducibility, and is less overhead.