
Notes on using a Digital Ocean Droplet to copy Google Takeout data to Backblaze B2, for my own reference and in case they are useful to others. Google Takeout allows exporting all your Google data, but it requires a modern browser with a GUI and JS. The total data can be huge (~36GB for my last export), so downloading it to my local machine and then uploading it again can be slow (even on a Gigabit connection).
Instead, this approach uses a remote machine to perform the sync. The machine is set up with XFCE, X2Go (I tried xrdp and it was really slow), and rclone. This was inspired by this post, but I already had a Digital Ocean account and find it a lot easier to use than AWS.
1. Create a Droplet with 4-8GB of RAM so we can run a desktop and a browser.
2. `apt update && apt install rclone xubuntu-desktop x2godesktopsharing` - this seems to also install Firefox. Update: I prefer NoMachine over x2go, but that has to be downloaded separately.
3. Copy your SSH public key to /home/nikhil/.ssh/authorized_keys.
4. `rclone config` - now follow the instructions below to get B2 set up.
5. Use the B2 website to create a bucket for your Takeout data.
6. Create an Application Key with read-write access to this bucket.
7. Finish the rclone config setup above with the application key details (a sketch of the resulting commands follows this list).
8. On your laptop/home computer, install x2goclient/NoMachine. On ChromeOS, this is `sudo apt install x2goclient` in the Linux container.
9. Launch x2goclient and create a new session config. The Host is your Digital Ocean Droplet IP and the SSH key is the private key (I lost several minutes because I accidentally put my public key here).
10. You should now be able to connect using x2go and get a remote desktop.
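For reference, here is a minimal sketch of the rclone side, assuming the remote is named `b2`, the bucket is called `my-takeout-bucket`, and the Takeout archives end up in ~/Downloads (all of these are placeholders); the account and key values are the Application Key ID and secret created above.

```sh
# One-time: create the B2 remote non-interactively.
# "b2" (first argument) is the remote name, the second "b2" is the backend type.
rclone config create b2 b2 account YOUR_KEY_ID key YOUR_APPLICATION_KEY

# Sanity check: list the buckets visible to this key.
rclone lsd b2:

# After downloading the Takeout archives in the remote browser,
# push them to the bucket. --progress shows transfer status.
rclone copy ~/Downloads b2:my-takeout-bucket/takeout \
    --include "takeout-*.zip" --progress
```

`rclone config create` just scripts the same setup you would get by running `rclone config` interactively and answering the prompts.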
We don't want the Droplet running except for the couple of minutes it takes to back up. In addition, even powered-off Droplets are still charged. Instead, it is cheaper to store it as a snapshot and re-create a new Droplet from the snapshot.
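The snapshot/re-create cycle can be done from the Digital Ocean web UI; as a rough sketch with doctl (Digital Ocean's official CLI), it looks something like this, where the droplet ID, names, size slug, and region are placeholders.

```sh
# Power off for a consistent image, snapshot, then delete the Droplet
# so it stops accruing charges.
doctl compute droplet-action power-off <droplet-id> --wait
doctl compute droplet-action snapshot <droplet-id> --snapshot-name takeout-box --wait
doctl compute droplet delete <droplet-id> --force

# Next time: find the snapshot ID and create a fresh Droplet from it.
doctl compute snapshot list
doctl compute droplet create takeout-box \
    --image <snapshot-id> --size s-4vcpu-8gb --region nyc1 \
    --ssh-keys <ssh-key-fingerprint>
```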