1 Jul 2021, 10:50 a.m.
Does anyone know of good solutions to this issue?
I'm struggling to find a way to preserve my data when corrupted systems keep damaging it and my hard drives break frequently. This has been very hard for me to solve.
I'm basically running off a Raspberry Pi right now, and I don't have a lot of money to spend. I do have a number of external drives, but they tend to break faster than I can recover my files from them, which leaves me with a growing queue of drives "to image/analyse".

I've been using git-annex, but I'm running into the issue that `git annex repair` seems to wipe my git history when it hits object corruption. What kinds of solutions already exist for large, hash-addressed data integrity with history, ones that will keep working when the storage media are failing quickly?
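To make the question concrete, here is a rough sketch of the kind of pre-repair safeguard I have in mind (the repository path, the `/mnt/backup` mount point, and the `par2` parity step are all hypothetical, not things I actually have set up):

```bash
#!/bin/sh
# Sketch: put a copy of the git history somewhere else and check annexed
# content *before* running anything destructive like `git annex repair`.
set -eu

REPO="$HOME/annex"                                   # hypothetical repo path
BACKUP="/mnt/backup/annex-history-$(date +%Y%m%d).git"  # hypothetical spare drive

# 1. Mirror the full git history (all refs) onto separate media, so a
#    destructive repair can't take the only copy of the history with it.
git clone --mirror "$REPO" "$BACKUP"

# 2. Check annexed file content against its recorded checksums; this only
#    reports corruption, it doesn't rewrite anything.
cd "$REPO"
git annex fsck

# 3. Optional idea: generate parity data for the annex objects so a few bad
#    sectors could be repaired in place (needs the par2 tool, untested here):
# find .git/annex/objects -type f -print0 \
#   | xargs -0 par2 create -r10 /mnt/backup/annex-objects.par2
```

The idea is just to make sure the git history itself lives on more than one drive before anything destructive runs, since `git annex repair` is what has been wiping it; I'm hoping there's a more principled tool that does this kind of thing for me.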