home.social

#lzma2 — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #lzma2, aggregated by home.social.

  1. Every now and then I archive larger sets of files, using #7zip for that (with #LZMA2 and ultra compression). With the "qs" option and a dictionary size of 1024MB, the archives get considerably smaller still, depending on the files.
    Unbeaten, however, is #ZPAQ (thanks to its dedup algorithm). It is unbelievable how small the archives get (compression AND decompression are CPU- and time-intensive, though). ZPAQ has no GUI of its own, but it is integrated into #PeaZIP.
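The "qs" trick above (sorting files by type so similar content ends up adjacent in the solid stream) can be sketched with Python's stdlib `lzma` module. The "files" and sizes below are invented for illustration, and LZMA2's `dict_size` filter option stands in for 7-Zip's dictionary setting:

```python
import lzma
import os

def lzma2_size(data, dict_size):
    """Compressed size of data under LZMA2 with the given dictionary size."""
    # preset=1 keeps the demo fast; dict_size overrides the preset's dictionary.
    filters = [{"id": lzma.FILTER_LZMA2, "preset": 1, "dict_size": dict_size}]
    return len(lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters))

# Three pairs of identical 128 KiB "files"; random content, so only the
# duplicate copies are compressible at all.
a, b, c = (os.urandom(128 << 10) for _ in range(3))

by_name = b"".join([a, b, c, a, b, c])  # duplicates 384 KiB apart
by_type = b"".join([a, a, b, b, c, c])  # duplicates adjacent, as with qs

dict_size = 256 << 10  # smaller than the 384 KiB gap between duplicates above
unsorted_size = lzma2_size(by_name, dict_size)  # duplicates out of reach
sorted_size = lzma2_size(by_type, dict_size)    # duplicates collapse into matches
print(unsorted_size, sorted_size)
```

With a large enough dictionary the ordering stops mattering, which is why a big dictionary helps even without qs.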

  2. Any experts here?

    I compressed an archive of nearly identical files (snapshots of the list of my tabs) in two ways.

    First, as .tar.zstd, and that gave me 325MiB.

    Then, I compressed the same files with LZMA2, and got only 35MiB. Much closer to what I expected.

    What gives? Is tar stringing the files together in a way that prevents zstd from detecting duplicate data across files?
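One plausible answer (an assumption about the poster's settings, not something stated above): tar isn't the problem, window size is. At ordinary levels zstd only looks for matches within a window of a few MiB (its --long mode raises this), while LZMA2 can use a dictionary of hundreds of MiB, so duplicate files far apart in the tar stream are still found. The stdlib `lzma` module can sketch the effect, with a deliberately small LZMA2 dictionary standing in for a small zstd window:

```python
import lzma
import os

def lzma2_size(data, dict_size):
    """Compressed size of data under LZMA2 with the given dictionary size."""
    filters = [{"id": lzma.FILTER_LZMA2, "preset": 1, "dict_size": dict_size}]
    return len(lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters))

# One 1 MiB random block appears twice, with 3 MiB of unrelated data in
# between -- two near-identical files separated by others in a tar stream.
block = os.urandom(1 << 20)
stream = block + os.urandom(3 << 20) + block

small = lzma2_size(stream, 1 << 20)  # 1 MiB window: second copy out of reach
large = lzma2_size(stream, 8 << 20)  # 8 MiB window: second copy is one long match
print(small, large)
```

The small-window size stays close to the full 5 MiB input, while the large-window size drops by roughly the length of the duplicated block.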

  3. #mastoadmin The increase in backup sizes is starting to worry me.

    I'm the only user on my instance, and I follow 189 people.

    No remote assets are included, just the #postgres db and the assets of #mastodon. The files are compressed at the maximum #7zip compression level, with #lzma2 and #encryption enabled.

    Backups are not #incremental; I do a full #backup of my #docker instance daily.

    The instance runs on a local server with a 120GB SSD.

    I'll keep on monitoring this, but this increase is absurd.
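For a sense of scale, keeping daily full backups makes total stored data grow roughly quadratically over time, while one full backup plus daily increments grows linearly. A toy calculation (all figures hypothetical, not taken from the post):

```python
# Hypothetical figures: a 2 GiB base backup gaining 50 MiB of new data per day.
base_gib = 2.0
new_per_day_gib = 0.05
days_kept = 30

# Daily full backups: each day stores the entire, slowly growing archive.
full_total = sum(base_gib + new_per_day_gib * day for day in range(days_kept))

# Incremental: one full backup, then one small delta per day.
incr_total = base_gib + new_per_day_gib * (days_kept - 1)

print(f"full: {full_total:.1f} GiB, incremental: {incr_total:.2f} GiB")
```

Even with modest daily churn, the full-backup total dwarfs the incremental one, which matters on a 120GB disk.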