Hi,
I have just started out with Nim (and am quite new to programming in general), and have really been enjoying the language so far. I am having a problem with some code designed to read from a .csv file. The strange thing is that it compiles and runs fine under Ubuntu, but gives a SIGSEGV error when compiled and run under Windows 10 (in both cases using the default compiler flags).
Basically, I am trying to build a 2D map, populating it with entries from a .csv file. The first row of the file gives the dimensions of the map, while the remaining rows show what is in each xy location.
Here is the main routine:
import parsecsv
import os
include miscFunctions

proc build_map()

proc build_map() =
  let mapDimensions = readCSV("../maps/densityMap.csv", 0, 0, 1)
  let mapX = mapDimensions[0].parseInt()
  let mapY = mapDimensions[1].parseInt()
  var densityMap = newSeq[seq[string]](mapX)
  for i in 0 ..< mapX:
    densityMap[i].newSeq(mapY)
  ### crashes at x = 13, y = 12 of the following: ###
  for y in 0 ..< mapY:
    for x in 0 ..< mapX:
      let sectorDensity = readCSV("../maps/densityMap.csv", y + 1, x)
      densityMap[x][y] = sectorDensity[0]

build_map()
And here is the .csv reader:
import parsecsv
import strutils

proc readCSV(fileName: string, rowNum: int, colMin: int, colMax: int = 0): auto

#[
  Function: readCSV
  -----------------
  Parameters: fileName: path to .csv file to be opened
              rowNum: row to be read (first row = 0)
              colMin: index of first column to be read (first column = 0)
              colMax: optional index of the last column to be read, else it is
                      automatically set to 0 or colMin, whichever is greater

  Returns: string sequence with the contents of columns colMin to colMax, inclusive
  NOTE: the returned sequence needs to be unpacked by the caller, and parsed into
        the appropriate type (from string)
]#
proc readCSV(fileName: string, rowNum: int, colMin: int, colMax: int = 0): auto =
  var p: CsvParser
  try:
    discard open(fileName)
  except:
    raise newException(IOError, fileName & " not found")
  p.open(fileName)
  for i in 0 .. rowNum:
    try:
      discard p.readRow()
      if i == rowNum:
        if colMax < colMin:
          return p.row[colMin .. colMin]
        else:
          return p.row[colMin .. colMax]
    except:
      raise newException(IOError, "Row not found")
  p.close()
If I use echos to track the .csv reading, it makes it to x = 13 and y = 12 before giving the error. And it does so even if I change it to read the same cell for all map coordinates, i.e. by changing:
let sectorDensity = readCSV("../maps/densityMap.csv", y + 1, x)
to
let sectorDensity = readCSV("../maps/densityMap.csv", 0, 0, 1)
I think it might be some sort of memory issue, because even before it crashes, the Windows-built .exe runs very slowly compared to the Linux build (as judged by echoing sectorDensity[x][y] as the .csv entries are being added).
Any help with this would be greatly appreciated!
bluenote: I'm not sure? Is the entire file read into memory when it is opened, or is each row read in separately using readRow? Either way, I could read an entire row at a time and loop through all the x'es in that row in one call of readRow, that's true.
Edit: Oh, actually I see what you mean - I am rereading the rows every time as well.
I'm not sure how to fix that, though, since I want the readCSV procedure to be generalized, and so I want to build the map in a separate procedure. I guess I could make readCSV build and return a sequence based on the arguments specified, but I am not sure that is the best way to solve the problem (and in this case at least, the files are only 20x20 at most, so I don't think it is too bad anyway).
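A minimal sketch of the row-at-a-time idea (the proc name readWholeCSV is made up for this sketch, not from the original code): open the file once and collect each row as the parser walks the file, so no cell is ever read twice.

```nim
import parsecsv

# Open the file once and read it row by row, instead of reopening
# it for every single cell as the per-cell readCSV calls do.
proc readWholeCSV(fileName: string): seq[seq[string]] =
  var p: CsvParser
  p.open(fileName)
  result = @[]
  while p.readRow():
    result.add(p.row)  # p.row holds the current row as a seq[string]
  p.close()
```

The caller can then index the result as rows[y][x] and drop the per-cell reads entirely.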
Hi! Nice to hear you are enjoying Nim.
I think I would do something like this:
import parsecsv
from os import fileExists
from strutils import parseInt

type Map = object
  w, h: int
  map: seq[seq[string]]

proc mapFromCSV(fileName: string): Map =
  var p: CsvParser
  if not fileExists(fileName):
    raise newException(IOError, fileName & " not found")
  p.open(fileName)
  discard p.readRow()
  # TODO: ...check if p.row is empty...
  result.w = p.row[0].parseInt
  result.h = p.row[1].parseInt
  result.map = @[]
  while p.readRow():
    result.map &= p.row  # append the row to our map
  p.close()
And then use this map object to build another map.
But if you want to keep it closer to what you already have, maybe something like this:
import parsecsv
from os import fileExists

proc readCSV*(fileName: string): seq[seq[string]] =
  var p: CsvParser
  if not fileExists(fileName):
    raise newException(IOError, fileName & " not found")
  p.open(fileName)
  result = @[]
  while p.readRow():
    result &= p.row
  p.close()
would be better, and then use it like this:
from strutils import parseInt
import <file that reads csv>  # I prefer import instead of include

# keeping the Map object from the first example
proc build_map(): Map =
  let mapseq = readCSV("../maps/densityMap.csv")
  result.map = @[]
  for y, row in mapseq:
    if y == 0:  # first row, so read dimensions
      result.w = row[0].parseInt()
      result.h = row[1].parseInt()
    else:
      # if you don't need to transform the map you read, just do result.map &= row
      result.map &= newSeq[string](row.len)
      for x, el in row:
        result.map[^1][x] = el  # ^1 indexes the row we just appended

var mymap = build_map()
(On mobile so I couldn't run it, but it should work.) Btw: what do you mean by 'generalized'? If you want different return types, I'd try overloading readCsv, or generics.

What I meant by generalized is that I want to use the readCSV proc for reading any number and format of entries - so sometimes a single entry at a specific location, sometimes a 2D sequence with the dimensions held within the file. Basically, I want the output format and the location of the entries to be read to be specified by the caller.
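A rough sketch of the overloading idea (both proc bodies here are illustrative, not taken from the thread): two procs share the name readCSV but differ in parameters and return type, so the caller's argument list picks the shape of the output.

```nim
import parsecsv

# Overload 1: the whole file as a 2-D string sequence.
proc readCSV(fileName: string): seq[seq[string]] =
  var p: CsvParser
  p.open(fileName)
  result = @[]
  while p.readRow():
    result.add(p.row)
  p.close()

# Overload 2: a single entry at (rowNum, colNum), reusing overload 1.
proc readCSV(fileName: string, rowNum, colNum: int): string =
  readCSV(fileName)[rowNum][colNum]
```

The compiler dispatches on the argument list, so readCSV(path) and readCSV(path, 1, 0) can coexist without the caller having to name different procs for different return types.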
I can definitely adapt what you have toward that design goal, though, and make the procedure as a whole more efficient.
One question, though - is there a reason you don't like to use include?
Thanks!