Sorry, but I can’t think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way to do it is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires.” That’s one week to the minute from the time of the initial request.

I don’t have some kind of fancy California internet, I just have normal home internet, and there is just no way to download a 50 gig (or even 2 gig) file in one go - there are always interruptions that require restarting the download. But if you try to download the files too many times, Google gives you another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive at once either; you have to select each file part individually.

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google gives me another error.

4 points

Well, obviously they don’t want you to!

112 points

There’s no financial incentive for them to make it easy to leave Google. Takeout only exists to comply with regulations (e.g. the Digital Markets Act), and as usual, they’re doing the bare minimum to not get sued.

11 points

Or why is Google Takeout as good as it is? It’s got no business being as useful as it is in a profit-maximizing corpo. 😂 It could be way worse while still technically compliant. Or expect Takeout to get worse over time as Google looks into under-maximized profit streams.

25 points

Probably because the individual engineers working on Takeout care about doing a good job, even though the higher-ups would prefer something half-assed. I work for a major tech company and I’ve been in that same situation before, e.g. when I was working on GDPR compliance. I read the GDPR and tried hard to comply with the spirit of the law, but it was abundantly clear that no one above me had read it and they only cared about doing the bare minimum.

7 points

Most likely. Plus, Takeout appeared way before Google was showing any signs of profit maximization, back when it didn’t even hold the monopoly position it does today.

20 points

It doesn’t have an option to split it?

When I did my Google Takeout to delete all my pics from Google Photos, there was an option to split it into something like “one zip every 2 GB.”

10 points

The first time, I tried it in the two-gigabyte blocks. The problem with that is I have to download them one or two at a time. It’s not very easy to do over the course of a week on a normal internet connection. Keep in mind, I also have a job.

I got about 50 out of 60 files before the one week timer reset and I had to start all over.

10 points

You could look into using a download manager. No reason for you to manually start each download in sequence if there’s a way to get your computer to automatically start the next as soon as one finishes.
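
If you’d rather script it than install a full download manager, here’s a rough Python sketch of the same idea: it walks a list of part URLs one at a time, resumes with an HTTP Range header if a file is already partially downloaded, and retries on failure. The URLs and cookies are placeholders, not anything Takeout gives you directly - the links only work from an authenticated session, so you’d have to paste in your own part URLs and browser cookies, and it may still be finicky.

```python
# Rough sketch of a sequential, resumable downloader for Takeout parts.
# URLS and COOKIES are placeholders: fill both in from your own browser session.
import os
import requests

URLS = [
    # "https://takeout.google.com/.../part-001.zip",  # placeholder
]
COOKIES = {}  # your authenticated browser session cookies go here

def download(url, dest):
    # Resume from wherever the previous attempt stopped, via an HTTP Range header.
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    headers = {"Range": f"bytes={start}-"} if start else {}
    with requests.get(url, headers=headers, cookies=COOKIES,
                      stream=True, timeout=60) as r:
        r.raise_for_status()
        # 206 means the server honored the range request, so append; otherwise rewrite.
        mode = "ab" if start and r.status_code == 206 else "wb"
        with open(dest, mode) as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)

for i, url in enumerate(URLS, start=1):
    dest = f"takeout-{i:03d}.zip"
    for attempt in range(10):  # retry each part a handful of times
        try:
            download(url, dest)
            break
        except requests.RequestException as exc:
            print(f"{dest}: attempt {attempt + 1} failed ({exc}), retrying")
```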

2 points

Any recommendations? Windows or Linux?

13 points

Apparently you can save it to Google Drive, then install the Google Drive program and make that folder available offline so it downloads everything to your computer.

  1. When you set up the Google Takeout export, choose “Save in a Google Drive folder”

  2. Install the Google Drive PC client (Drive for desktop)

  3. It will create a new drive (e.g. G:) in your file explorer. Right-click on the Takeout folder and select “Make available offline”. All files in that folder will be downloaded by Google Drive for desktop in the background, and you will be able to copy them to another location, since they will be local files.

4 points

Have you tried mounting the Google Drive on your computer and copying the files with your file manager?

12 points

From a search, it seems photos are no longer accessible via Google Drive and photos downloaded through the API (such as with Rclone) are not in full resolution and have the EXIF data stripped.

Google really fucks over anyone using Google Photos as a backup.

2 points

Yeah, they really want to keep your data.

5 points

Yeah, with Takeout there are tools that can reconstruct the metadata. I think Google includes some JSON sidecar files or something like that. It’s critical to maintain the dates of the photos.
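
For the dates specifically, here’s a rough sketch of how you could put them back yourself, assuming the layout my export used: a `.json` sidecar next to each photo with a `photoTakenTime.timestamp` field. Check a few files from your own Takeout before trusting it.

```python
# Rough sketch: copy the "photo taken" date from Takeout's JSON sidecars
# back onto the image files. Assumes sidecars are named "<photo>.json" and
# contain a photoTakenTime.timestamp field -- verify against your own export.
import json
import os
from pathlib import Path

TAKEOUT_DIR = Path("Takeout/Google Photos")  # adjust to wherever you extracted

for sidecar in TAKEOUT_DIR.rglob("*.json"):
    photo = sidecar.with_suffix("")  # "IMG_1234.jpg.json" -> "IMG_1234.jpg"
    if not photo.exists():
        continue  # skips album-level metadata files with no matching photo
    meta = json.loads(sidecar.read_text(encoding="utf-8"))
    ts = meta.get("photoTakenTime", {}).get("timestamp")
    if ts:
        # Set both access and modified time to the original capture time.
        os.utime(photo, (int(ts), int(ts)))
```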

Also I think if I did that I would need double the storage, right? To sync the drive and to copy the files?

1 point

From what I’ve read, I would not trust any process other than the takeout process. Do the album thing to split it up.

37 points

I know it’s not ideal, but if you can afford it, you could rent a VPS from a cloud provider for a week or two, do the Google Takeout download on that, and then use a sync tool to copy the files to your own server.

3 points

I was gonna suggest the same.

10 points

I don’t know how to do any of that, but it will help to know about it anyway. I’ll look into it. Thanks

11 points

Go the completely dumb route and install a desktop OS like Ubuntu Desktop. Then remote into it and use the browser just as you normally would to download the stuff onto it. We’ll help you with moving the data off it to your local machine afterwards. Critically, the machine has to have enough storage to hold your entire download.

3 points

Instead of having to do an operating system setup with a cloud provider, maybe another cloud backup service would work. Something like Backblaze can receive your Google files, and then you can download from Backblaze at your leisure.

https://help.goodsync.com/hc/en-us/articles/115003419711-Backblaze-B2

Or use the filters by date to limit the amount of takeout data that’s created? Then repeat with different filters for the next chunk.

2 points

Use this. It’s finicky, but it works for me. You have to start the download on one device, then pause it, copy the command to your file server, then run it there. It’s slow and you can only do one at a time, but it’s enough to leave it idling.

