var
  NOM: int
NOM = (d2.year - d1.year) * 12 + ord(d2.month) - ord(d1.month)
if d2.monthday < d1.monthday:
  NOM -= 1
The application takes around 45s to run on a given number of records on Linux.
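For reference, a minimal self-contained sketch of this variant might look like the following; the proc name, the dd-MM-yyyy format string, and the example dates are assumptions, since only the fragment above is shown.

import times

# Hypothetical wrapper around the fragment above; the format string is assumed.
proc monthsBetween(s1, s2: string): int =
  let
    d1 = parse(s1, "dd-MM-yyyy")   # parse builds a DateTime in the local timezone
    d2 = parse(s2, "dd-MM-yyyy")
  result = (d2.year - d1.year) * 12 + ord(d2.month) - ord(d1.month)
  if d2.monthday < d1.monthday:    # last month is not complete yet
    dec result

echo monthsBetween("31-01-2020", "15-03-2021")   # 13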
Alongside it I have a second version of the application that does the same calculation on the text fields directly, without converting them into Dates; everything else is identical. An example is given below.
var
  NOM: int
  dat1, dat2: seq[string]
  sep: string
sep = if contains(d1, "-"): "-" else: "/"
dat1 = split(d1, sep)
sep = if contains(d2, "-"): "-" else: "/"
dat2 = split(d2, sep)
NOM = (dat2[2].parseInt - dat1[2].parseInt) * 12 + dat2[1].parseInt - dat1[1].parseInt
if dat2[0].parseInt < dat1[0].parseInt:
  NOM -= 1
return NOM
This version takes around 25s to run on the same data, on the same computer running Linux. Both are compiled with similar compiler flags.
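Wrapped up as a self-contained proc for testing, the string-only variant could look like this; the proc name and the example dates are assumptions, while the day-month-year field order follows the indexing in the fragment above.

import strutils

proc monthsBetweenStr(d1, d2: string): int =
  let
    sep1 = if d1.contains("-"): "-" else: "/"   # pick the separator actually present
    sep2 = if d2.contains("-"): "-" else: "/"
    dat1 = d1.split(sep1)                       # assumed layout: [day, month, year]
    dat2 = d2.split(sep2)
  result = (dat2[2].parseInt - dat1[2].parseInt) * 12 +
           dat2[1].parseInt - dat1[1].parseInt
  if dat2[0].parseInt < dat1[0].parseInt:
    dec result

echo monthsBetweenStr("31-01-2020", "15/03/2021")   # 13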
Another thing I have noticed is that when both of these applications are run on Windows or Mac, they take approximately the same time.
I would like to understand the reason for the difference in speed on Linux: why does using the times library increase the run time on Linux?
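If it helps, here is a rough sketch of how the cost of the date parsing alone could be isolated; the iteration count, format string, and example date are arbitrary assumptions.

import std/[times, monotimes]

let start = getMonoTime()
for _ in 0 ..< 1_000_000:
  discard parse("15-03-2021", "dd-MM-yyyy")   # only the times-based parsing step
echo getMonoTime() - start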
AC