The sad thing about buying a car is that you need a license to drive it. Having said that: what's in your Git repo is completely enough to deploy a new version of your website. You don't need a backup of every single file in your local installation, just the files needed to recreate the site. The missing directories are created when Hugo runs and are not relevant for reinstalling it.
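To make that concrete, here is a hedged sketch of a `.gitignore` that keeps Hugo's generated directories out of the repo. The directory names below are Hugo's defaults and an assumption about your setup:

```bash
# Hypothetical example: keep Hugo's generated output out of version control.
# These are Hugo's default output/cache locations; adjust if you changed
# them in your site config.
cat >> .gitignore <<'EOF'
public/
resources/_gen/
.hugo_build.lock
EOF
```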
If you want a "backup" of your website, then back up the `public` folder after running `hugo`.
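Something along these lines, assuming a standard Hugo project layout (the archive name and the rsync target are placeholders):

```bash
# A minimal sketch: build the site, then archive the generated output.
hugo                                    # writes the finished site into public/
tar -czf my-site-backup.tar.gz public/  # that archive IS your website backup
# or sync the output straight to a server you control:
# rsync -av public/ user@example.com:/var/www/my-site/
```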
Loading the whole project from your backup into a webspace (cloudspace, call it whatever) IMHO does not work because of the file permissions in your backup: if you back up to FAT there are none; if you back up to NTFS it expects your Windows environment with its users; if you back up to EXT or other Linux file systems your user has an ID that changes with each environment (FAT/NTFS/EXT are different file systems). That's completely normal.
You typically check out your repo locally and then deploy to a cloudspace if it doesn't offer an environment that lets you check out via Git and use Node, Go, and so on.
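A rough sketch of that workflow, with a placeholder repo URL and webspace path:

```bash
# The host only serves static files, so the build happens on YOUR machine;
# only the generated result gets uploaded.
git clone https://github.com/you/your-site.git
cd your-site
hugo --minify                           # build locally
rsync -av --delete public/ user@your-webspace.example:/htdocs/
```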
Back to the initial question:
> Is there a way to automatically backup the local directory to the cloud and use the same directory with a remote Git repo?
Yes, if the remote system is 100% identical to your local system: same hard drive formatting, same user setup, same operating system, same installed programs. You can't guarantee that. Just add users in a different order and they will have different user IDs in the system. So you put the files required to "rebuild" the installation into GitHub and let the remote system install the requirements for you.
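For illustration, roughly what such a remote system (a CI runner, a fresh server) would do with your repo. The Hugo version and download URL are placeholders, pick whatever matches your site:

```bash
# Fetch the sources, install Hugo, rebuild the site from scratch.
git clone https://github.com/you/your-site.git && cd your-site
wget https://github.com/gohugoio/hugo/releases/download/v0.125.0/hugo_extended_0.125.0_linux-amd64.tar.gz
tar -xzf hugo_extended_0.125.0_linux-amd64.tar.gz hugo
./hugo                                  # regenerates public/ on the remote side
```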
Sidenote: storage is not an operating system. You can't run Hugo on pCloud or Dropbox, and moving the files from there to a third system makes it even worse.
To stay with the car analogy: Hugo is the motor in your car. It requires tires, but it does not care about the color or the number of seats. All it does is turn a shaft that in turn spins the tires. That's why Hugo is so powerful: you can put it in any car.