What's the best way to map a standard file into a single data structure for editing?
The intention is to read and write, via a single data structure, any standard file that contains a descriptor header (which includes the size of the data) followed by a variable-size data section, and to develop a package that makes it easy to create file editors and converters.
A good example would be reading a .bmp or a .wav file, doing some work on it (filtering, editing, changing its resolution or size), and converting it into another type of file such as a .gif or an .mp3.
Note: I'm not sure whether endianness makes the data structure platform-dependent, but if it does, that can still be resolved.
Many Nim libraries handle generic JSON serialization. https://github.com/treeform/jsony/ is my favorite, but if you search GitHub for JSON in the Nim language you have multiple choices.
You can also use an existing serialization protocol such as Protobuf or MessagePack.
Finally, if you store binary/numerical buffers you can use HDF5. There is a great Nim library for it: https://github.com/Vindaar/nimhdf5.
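To illustrate the jsony route, here is a minimal round-trip sketch; the `Header`/`Image` types are invented for the example, while `toJson`/`fromJson` are jsony's API:

```nim
import jsony

type
  Header = object
    width, height: int
  Image = object
    header: Header
    pixels: seq[uint8]

let img = Image(header: Header(width: 2, height: 1),
                pixels: @[255'u8, 0'u8, 128'u8])

# Serialize the whole structure to a JSON string, then round-trip it.
let s = img.toJson()
let back = s.fromJson(Image)
doAssert back.header.width == 2
doAssert back.pixels.len == 3
```

Of course JSON is a text format, so this fits metadata and intermediate representations better than raw pixel/sample payloads.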
Not sure what you mean by a single structure — something like FileType below?
type
  FileHeader = object
    # stuff about the file construction
  DataHeader = object
    # stuff about the data construction
  MetaData = object
    # metadata about the file data
  FileData = object
    data: seq[byte]  # or seq[SomeItem] for items of some type
  FileType = object
    fileheader: FileHeader
    dataheader: DataHeader
    metadata: MetaData
    filedata: FileData
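Filling such a structure from a file can be sketched with std/streams; this is a self-contained example where the `magic`/`dataSize` fields and the on-disk layout are invented for illustration:

```nim
import std/streams

type
  FileHeader = object
    magic: array[4, char]
    dataSize: uint32  # size of the data section, per the descriptor header

proc readFileHeader(s: Stream): FileHeader =
  # Read the fixed-size descriptor header from the front of the stream.
  discard s.readData(addr result.magic, 4)
  result.dataSize = s.readUint32()  # native endianness; swap here if needed

proc readPayload(s: Stream, h: FileHeader): seq[byte] =
  # Read exactly dataSize bytes of variable-size payload.
  result = newSeq[byte](h.dataSize)
  if h.dataSize > 0:
    discard s.readData(addr result[0], result.len)

when isMainModule:
  # Build an in-memory "file": 4-byte magic, 4-byte size, 3-byte payload.
  var ss = newStringStream()
  ss.write(['D', 'E', 'M', 'O'])
  ss.write(3'u32)
  ss.write(['a', 'b', 'c'])
  ss.setPosition(0)
  let h = ss.readFileHeader()
  let data = ss.readPayload(h)
  doAssert h.dataSize == 3
  doAssert data.len == 3
```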
For an intermediate file format one could have a look at RIFF files (WAV files are RIFF files).
For complex, large things I prefer an in-memory SQLite db, as it at least gives me a standard way to query the data.
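The in-memory SQLite approach could look like this; note the import path is an assumption (on recent Nim the module moved to the external db_connector package), and the table layout is just an example:

```nim
import std/db_sqlite  # on newer Nim: import db_connector/db_sqlite

# ":memory:" gives a throwaway in-memory database.
let db = open(":memory:", "", "", "")
db.exec(sql"CREATE TABLE samples (idx INTEGER, value REAL)")
db.exec(sql"INSERT INTO samples VALUES (?, ?)", 0, 0.5)
for row in db.fastRows(sql"SELECT value FROM samples WHERE idx = ?", 0):
  echo row[0]
db.close()
```

The win is that filtering and transforming the data becomes a matter of SQL queries instead of hand-written traversal code.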
I've been using https://github.com/sealmove/binarylang to (de)serialize binary blobs/protocols. It handles endianness and generates the data structures for you:
blobA > (deserialize) ObjA > proc toObjB > (serialize) ObjB
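The conversion step in that pipeline reads like this in plain Nim (no binarylang DSL here; `ObjA`, `ObjB` and `toObjB` are illustrative names):

```nim
type
  ObjA = object
    samples: seq[int16]    # e.g. what deserializing blobA produced
  ObjB = object
    samples: seq[float32]  # the target representation before serializing

proc toObjB(a: ObjA): ObjB =
  # Convert each 16-bit sample to a normalized float in [-1.0, 1.0).
  result.samples = newSeq[float32](a.samples.len)
  for i, s in a.samples:
    result.samples[i] = s.float32 / 32768.0

when isMainModule:
  let a = ObjA(samples: @[0'i16, 16384'i16, -32768'i16])
  let b = a.toObjB()
  doAssert b.samples[1] == 0.5
  doAssert b.samples[2] == -1.0
```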