https://warpcast.com/~/channel/programming

Ryan
@ryansmith
Have any of you had experience running SMCI? If so would you do it again?
3 replies
2 recasts
12 reactions

K
@kijijij
Nope. Scraping the web and running big data workloads?
1 reply
0 recast
0 reaction

Ryan
@ryansmith
Assuredly. I'm in possession of 10TB at the moment.
1 reply
0 recast
0 reaction

K
@kijijij
I would use Parquet to reduce storage. Streaming data to queries would be an interesting challenge. Best of luck!
1 reply
0 recast
0 reaction
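
Not part of the thread, but a minimal pyarrow sketch of K's storage suggestion; the file name pages.parquet and the column layout are hypothetical stand-ins for the scraped data:

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Hypothetical scraped-pages table; in practice this would come from the crawler.
table = pa.table({
    "url":    ["https://a.example", "https://b.example"] * 1_000,
    "status": [200, 404] * 1_000,
    "body":   ["<html>...</html>", ""] * 1_000,
})

# Columnar Parquet file, each column chunk compressed with LZ4.
pq.write_table(table, "pages.parquet", compression="lz4")
```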

Ryan
@ryansmith
What's the advantage of Parquet? I'm already compressing with ZFS and LZ4.
1 reply
0 recast
0 reaction

K
@kijijij
If I have a ton of columns in a table but only read a few of them, Parquet is useful: it stores data in a columnar format and decompresses only the required "cells" (row × column). You can use LZ4 with Parquet for compression, so you get both fast scanning and compression. With filesystem-level LZ4 you might be unpacking the whole file just to read a few bytes; if so, Parquet can help. Consider Parquet or Avro rather than storing the data in a DB and unnecessarily scanning every column of every row.
1 reply
0 recast
0 reaction
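
Again not from the thread: a short sketch of the column pruning K describes, reading back only two columns of the hypothetical file written above, so only those column chunks are fetched and decompressed:

```python
import pyarrow.parquet as pq

# Only the requested columns are read and decompressed; the large "body"
# column stays on disk untouched.
subset = pq.read_table("pages.parquet", columns=["url", "status"])
print(subset.num_rows, subset.column_names)
```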