There’s something in the way that database is (re)built that’s slow as molasses once you have a lot of files and versions. I don’t know anything about the internals, so I have no idea what it is, or whether there’s any hope that it could one day be improved. I happened to test a restore not long ago, after having had Duplicati running for several years, and discovered this unfortunate limitation.

If you have a way of monitoring task resource use, I suspect you’ll find that it is CPU-bound. The thing that seems to take so long is rebuilding the SQLite database from the remote files, and it’s apparently a single-threaded process. I also copied all the backup files to a local disk in hopes that the restore/rebuild would speed up a bit, but it doesn’t look like that makes much of a difference.

It looked like I had no choice but to delete and rebuild the database, which I did. It has now been running for a day or two and is again “stuck” at 90%. I tried to run a repair and got the same error as when I tried the restore: Unexpected number of remote volumes detected: 0!
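If you don’t have a monitoring stack handy, here is a minimal sketch for testing the CPU-bound hunch above by sampling a process’s CPU time from `/proc` (Linux only; this is my own illustration, not part of Duplicati, and `duplicati_pid` is a placeholder you’d fill in yourself):

```python
import os
import time

def cpu_percent(pid: int, interval: float = 1.0) -> float:
    """Approximate %CPU of `pid` over `interval` seconds (Linux /proc only)."""
    def ticks(pid: int) -> int:
        with open(f"/proc/{pid}/stat") as f:
            # The comm field (2nd, in parentheses) may contain spaces,
            # so split only after the closing parenthesis; utime and
            # stime are then the 12th and 13th of the remaining fields.
            rest = f.read().rpartition(")")[2].split()
        return int(rest[11]) + int(rest[12])  # utime + stime, in clock ticks

    before = ticks(pid)
    time.sleep(interval)
    after = ticks(pid)
    hz = os.sysconf("SC_CLK_TCK")  # clock ticks per second
    return 100.0 * (after - before) / hz / interval

# Hypothetical usage: point it at the Duplicati process id. A steady
# reading near 100% on a multi-core box is what a single busy thread
# (i.e. a CPU-bound, single-threaded rebuild) would look like.
# print(cpu_percent(duplicati_pid, interval=5.0))
```

A reading well below 100% with the disks and network also idle would instead point at latency somewhere else, so the sample is worth taking before copying files around.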
My version: linuxserver.io Docker container, v2.0.6.3_beta_

I was running weekly (encrypted) backups for the last couple of months, and now I have lost all my docker containers and need to restore a VM. Stuff to restore: a couple of container configuration files and 1. About 24h ago I created a new Duplicati container and just started a restore (restore - direct restore - …). Now I thought this doesn’t look too bad, and Duplicati also shows my old backup job.

The progress now sits at about 90% and it’s restoring the dblock files (I think):

    11:26 AM: Backend event: Get - Completed: (49.92 MB)
    11:26 AM: Pass 3 of 3, processing blocklist volume 199 of 5221
    11:26 AM: Backend event: Get - Started: (49.92 MB)
    11:21 AM: Backend event: Get - Completed: (49.94 MB)
    11:21 AM: Pass 3 of 3, processing blocklist volume 198 of 5221
    11:21 AM: Backend event: Get - Started: (49.94 MB)

One file needs about 5 minutes to complete, so if it’s really restoring those ~5000 files at 5 min/file I’ll have to wait a while… Is there anything I can do about the restore speed? Am I maybe just misinterpreting the logfiles, and it doesn’t need all 5221 files? It’s downloading to an SSD and restoring to an HDD, and I doubt that Duplicati is limited by CPU, RAM, I/O or network speed. I read about a bug concerning the restore, but that should be fixed in my version, right?

I tried a restore (via the backup job) and I get this error (I tried different restore points):

    System.Exception: Unexpected number of remote volumes detected: 0!
      at .LocalDatabase.UpdateRemoteVolume (System.String name, state, System.Int64 size, System.String hash, System.Boolean suppressCleanup, System.TimeSpan deleteGraceTime, transaction) in :0
      at .LocalDatabase.UpdateRemoteVolume (System.String name, state, System.Int64 size, System.String hash, System.Boolean suppressCleanup, transaction) in :0
      at .LocalDatabase.UpdateRemoteVolume (System.String name, state, System.Int64 size, System.String hash, transaction) in :0
      at .FilelistProcessor.RemoteListAnalysis (backend, options, .LocalDatabase database, log, 1 protectedFiles) in :0
      at .FilelistProcessor.VerifyRemoteList (backend, options, .LocalDatabase database, log, 1 protectedFiles) in :0
      at .FilelistProcessor.VerifyRemoteList (backend, options, .LocalDatabase database, backendWriter, System.Boolean latestVolumesOnly, transaction) in :0
      at .RestoreHandler.DoRun (.LocalDatabase dbparent, filter, result) in :0
      at .RestoreHandler.Run (System.String paths, filter) in :0
      at +c__DisplayClass15_0.b__0 (result) in :0
      at .RunAction (T result, System.String & paths, & filter, System.Action`1 method) in :0
      at .Restore (System.String paths, filter) in :0
      at (+IRunnerData data, System.Boolean fromQueue) in :0

I tried to verify the files and got the following error:

    System.IO.InvalidDataException: Found inconsistency in the following files while validating database:
    source/appdata/sonarr/logs.db, actual size 5320704, dbsize 0, blocksetid: 154919
      at .LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, transaction) in :0
      at .TestHandler.Run (System.Int64 samples) in :0
      at +c__DisplayClass30_0.b__0 (result) in :0
      at .RunAction (T result, System.String & paths, & filter, System.Action`1 method) in :0
      at .RunAction (T result, System.Action`1 method) in :0
      at .Test (System.Int64 samples) in :0
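For scale, a quick back-of-the-envelope check of the worry above, assuming the ~5 min/volume pace in the log really does hold for all 5221 blocklist volumes (the numbers are taken straight from the post):

```python
# Estimate from the log above: 5221 blocklist volumes at ~5 min each.
volumes = 5221
minutes_per_volume = 5

total_minutes = volumes * minutes_per_volume
print(f"{total_minutes} min ≈ {total_minutes / 60:.0f} h "
      f"≈ {total_minutes / (60 * 24):.1f} days")
# → 26105 min ≈ 435 h ≈ 18.1 days
```

Whether all 5221 volumes are actually needed is exactly the open question in the post, so this is a worst-case figure, not a prediction.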