Hi guys, I'm typically playing with data of several GBs, be it JSON or CSV. It's something I do at home, with my own PC and a little server I have (no fancy hardware, I just don't want to hear the fans running at night, so I've installed the server in the living room).
Now, I typically transform this data with KNIME and Power Query because they're easy to work with, but they really struggle with the ~7 GB allocated to them. My server has even less memory, and it's running other stuff.
I tried all sorts of tricks, but things either become very slow or just fail outright because of the available RAM (KNIME). I also tried R and, yeah, I get better results, but it removes the fast-and-easy element and becomes a time sink. Same with Python, where I'm even less proficient.
I've been wondering about loading that data into a MariaDB (or another SQL) database and transforming it on the server. I know databases are meant for this sort of problem, and I already know a bit of SQL, but I'm just exploring this option, so I'm coming here for advice before sinking a lot of time into it again, because I'll need a lot of transformations.
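For what it's worth, here's a minimal sketch of the idea I have in mind. I'm using Python's built-in sqlite3 as a stand-in for MariaDB (no server needed to try it), and the table name and sample rows are made up; the SQL workflow would be essentially the same: stream the file into a table, then let the database do the aggregation instead of holding everything in RAM.

```python
import sqlite3

# Hypothetical example: load CSV-like rows into a SQL table, then transform
# with a query. sqlite3 stands in for MariaDB; the SQL is nearly identical.
con = sqlite3.connect(":memory:")  # use a file path instead to persist on disk
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# In practice you'd stream rows from csv.reader so memory use stays flat,
# however big the file is; these rows are just placeholder data.
rows = [("north", 10.0), ("south", 5.0), ("north", 2.5)]
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# A typical transformation: aggregate inside the database,
# not in application memory.
totals = dict(con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
).fetchall())
print(totals)  # {'north': 12.5, 'south': 5.0}
```

With MariaDB proper, the bulk load step would be something like `LOAD DATA INFILE` instead of `executemany`, but the transform-in-SQL part carries over as-is.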