Support is basically telling me that the inherit process on the server can’t handle my "too large backup state’s indexes", and that I’m gonna have to start afresh and reupload 2TB or so from scratch. Not only that, but in order to even be allowed to do this, I’m gonna have to free up my license first by deleting my existing backup, and accept being left without one during the weeks this will take. You’ve gotta be kidding me?
And I’m in this situation in the first place not because I have a new computer I want to inherit the backup onto, but because support themselves suggested that I nuke my local backup state and restore it using the inherit feature, to work around the installer being unable to upgrade me to the latest 6.1 release. That’s how it all started.
No apologies given either, no offer to investigate further to avoid that very inconvenient outcome. I’ve been a customer for 7 years, but honestly, if I can’t trust your systems to handle the "too large" 2-3TB backup of my MacBook and a couple of external HDDs, and I’m supposed to just accept that, it suddenly makes me less confident in everything else you do. I hope I misunderstood something here, but I don’t think I did.
EDIT: Ok, reading the email again, I may have misunderstood the part where I need to delete my existing backup first. It looks like I can use the 14-day trial to start a new backup, then transfer the license over before the end of the trial. It’s still pretty bad because there’s no way I’ll have those 2-3TB reuploaded within 2 weeks, and of course it’s still a massive inconvenience to do so.
> inherit process on the server can’t handle my "too large backup state’s indexes"
Yeah, I'm sorry about that, and it is on my plate to fix. The issue is that Backblaze uses a ZIP library that only handles zip files up to 4 GBytes. We zip up your "backup state" (the list of files that were backed up) and download it to the client that is "Inheriting". Your "backup state" has exceeded 4 GBytes, which is unusual but not unheard of (maybe 2% of our customers right now). And it is on the rise as more and more customers have more data and back up for longer and longer with Backblaze.
The fix is to either link with a new zip library, or sub-divide your "backup state" into 2 or 3 or 4 zip files. Easy enough, but it doesn't help you this week.
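For illustration, here's roughly what that second approach could look like — a minimal Python sketch using the standard-library `zipfile` module. The function name, file layout, and cap are illustrative assumptions on my part, not Backblaze's actual code:

```python
import os
import zipfile

# The classic ZIP format (without the ZIP64 extension) caps an archive
# at 4 GBytes. One fix is a ZIP64-capable library (Python's zipfile
# already is, via allowZip64=True); the other, sketched here, is to
# split the file list across several sub-4GB archives.
SIZE_CAP = 4 * 1024**3  # stay under the classic-ZIP 4 GByte limit

def zip_in_parts(paths, out_prefix):
    """Pack `paths` into out_prefix.0.zip, out_prefix.1.zip, ... so no
    archive's total *input* size exceeds SIZE_CAP. Input size is a
    conservative proxy for output size once DEFLATE compresses.
    (A single file larger than SIZE_CAP would still need ZIP64.)"""
    part, used, zf = 0, 0, None
    for path in paths:
        size = os.path.getsize(path)
        if zf is None or used + size > SIZE_CAP:
            if zf is not None:
                zf.close()
            zf = zipfile.ZipFile(f"{out_prefix}.{part}.zip", "w",
                                 zipfile.ZIP_DEFLATED)
            part += 1
            used = 0
        zf.write(path)
        used += size
    if zf is not None:
        zf.close()

# e.g. zip_in_parts(list_of_backup_state_files, "backup_state")
```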
> No apologies given either
I am sorry about this. In our support tech's defense, they were CRUSHED this past week by our push to get everybody auto-updated ahead of the macOS 10.15 Catalina release, which came out yesterday. So if they were terse, it wasn't out of disrespect; it was the sheer load they were dealing with.
The issue was that if we didn't get everybody upgraded, anybody who chose to install macOS 10.15 Catalina would break Backblaze, and it would pop up a completely random error dialog that wasn't helpful and didn't solve the problem. By getting people upgraded before Catalina, there are no issues at all and they won't have to contact support.
> It’s still pretty bad because there’s no way I‘ll have those 2-3TBs reuploaded within 2 weeks
To maximize your chances, turn off all power-saving modes on your computer (don't even let the monitor go to sleep), make sure Backblaze is set to use 30 threads, and give it LONG stretches of time (8 hours overnight is ideal). You should be able to back up about 1 TByte every 24 hours, though it can go slower if you don't have an SSD or if your bandwidth is limited. One idea is to take your computer to a location with faster bandwidth, like a school or your workplace, leave it there for two or three days, then carry it back home for the incrementals. By the way, Comcast has announced that it literally intends to offer full 1 Gbit/sec service to every last customer in the United States, so another option is to upgrade your internet for a month, then downgrade it later.
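For a rough sense of why the two-week window is tight, here's a back-of-the-envelope estimate. The 80% efficiency factor is an assumption for protocol overhead, not measured Backblaze throughput:

```python
# Days to upload N TB at a given sustained uplink speed (assumed numbers).

def upload_days(data_tb: float, uplink_mbps: float, efficiency: float = 0.8) -> float:
    """Days to upload `data_tb` terabytes at `uplink_mbps` megabits/sec,
    derated by `efficiency` for overhead and throttling."""
    data_bits = data_tb * 1e12 * 8                     # decimal TB -> bits
    seconds = data_bits / (uplink_mbps * 1e6 * efficiency)
    return seconds / 86_400                            # seconds -> days

for mbps in (20, 100, 1000):
    print(f"{mbps:>5} Mbit/s uplink: 2 TB in ~{upload_days(2, mbps):.1f} days, "
          f"3 TB in ~{upload_days(3, mbps):.1f} days")
```

Under these assumptions, a 20 Mbit/s uplink needs roughly 12-17 days for 2-3 TB (right at the edge of the trial window), 100 Mbit/s needs about 2-3.5 days, and gigabit finishes in well under a day — which is why faster bandwidth, even temporarily, is the biggest lever.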
> can’t trust your systems to handle the "too large" 2-3TB backup
We can handle the backups, just not the "Inherit" feature for long-running backups with large amounts of data. And it's a fairly straightforward fix on my end; I just need to get to it. I'm sorry you got bitten by this shortcoming, I will get it fixed.