I know I can save the files in a local directory that's synced to the cloud, e.g. Dropbox, pCloud, etc. But this means I cannot create a Git repo in this folder, as one cannot create a Git repo in a directory that exists both in the cloud and is mapped to my machine (that's my newbie experience, anyway).
Is there a way to automatically backup the local directory to the cloud and use the same directory with a remote Git repo?
Unless I misunderstand, this statement is false. From the standpoint of a file system, any file system, your local git repository is just another directory.
Is there some reason you can’t “backup” by simply pushing to a remote repository (e.g., GitHub)?
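For what it's worth, that workflow is just an ordinary push. A minimal, self-contained sketch — using a local bare repository as a stand-in for GitHub so the example runs anywhere Git is installed (the paths and names here are made up for illustration):

```shell
set -e
# Stand-in "remote": a local bare repository (normally github.com/you/site.git).
backup=$(mktemp -d)/site-backup.git
git init -q --bare "$backup"

# A throwaway local repo with one commit to "back up".
site=$(mktemp -d)
cd "$site"
git init -q .
echo "hello" > index.md
git add index.md
git -c user.email=you@example.com -c user.name=You commit -qm "initial"

# The whole backup step: push the current branch to the remote.
git remote add origin "$backup"
git push -q origin HEAD
```

After the push, the remote holds the full history, which is all a backup needs.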
Thanks, I think I was wrong with what I said; however, when I try using Hugo in this way it doesn't work.
I have both Git and Hugo installed on my machine, but when I try to create a new Hugo site inside the directory that's synced to pCloud (like Dropbox), I get a permission denied error.
Regarding “backing up” to GitHub by pushing the repo to the remote: I have tried this before, but the entire Hugo directory and all subdirectories were not pushed to GitHub. Most directories were pushed, but not all, so I figured GitHub didn't accept some file types or something.
I guess I am wrong about this too! Would you know if this is how people “back up” their Hugo websites?
When I try to “back up” Hugo to GitHub, GitHub doesn't show all the files that are in my local Hugo directory. I created a test website as an example.
Here are the directories in my local Hugo directory:
e.g. the data, resources, and themes directories on my local machine are not found in the GitHub repo, even though my local machine's output shows the push worked fine.
The sad thing about buying a car is that you need a license to drive it. Having said that: what's in your Git repo is completely enough to deploy a new version of your website. You need no backup of every single file in your local installation, just the files needed to recreate the site. Those missing directories are created when Hugo runs and are not relevant to reinstalling it.
If you want a “backup” of your website, then back up the public folder after running hugo locally.
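That kind of backup can be as simple as archiving the folder. A hedged sketch — the `public/` directory here is created by the script itself as a stand-in, since in real use running `hugo` would have generated it:

```shell
set -e
work=$(mktemp -d)
cd "$work"

# Stand-in for a built site; in real use, running `hugo` creates public/.
mkdir -p public
echo "<html><body>demo</body></html>" > public/index.html

# Snapshot public/ into a dated archive.
stamp=$(date +%Y-%m-%d)
tar -czf "site-backup-$stamp.tar.gz" public
```

The resulting archive is a static copy of the site that can be unpacked onto any web server, independent of Hugo.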
Loading the whole project from your backup into a webspace (cloud space, call it whatever) IMHO does not work because of the file permissions in your backup: if you back up to FAT there are none; if you back up to NTFS it expects your Windows environment with its users; if you back up to EXT or other Linux formats, your user has an ID that changes with each environment. (FAT, NTFS, and EXT are different file systems.) That's completely normal.
You typically check out your repo locally and then deploy to a cloud space if the provider doesn't offer an environment that lets you check out via Git and use Node, Go, and so on.
Back to the initial question:
Is there a way to automatically backup the local directory to the cloud and use the same directory with a remote Git repo?
Yes, if the remote system is 100% identical to your local system: same hard-drive formatting, same user setup, same operating system, same installed programs. You can't guarantee that. Just add users in a different order and they will have different user IDs in the system. So you put the files required to “rebuild” the installation into GitHub and let the remote system install the requirements for you.
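In practice that means committing the sources and ignoring the generated output. A common (but not mandatory) .gitignore for a Hugo project looks like this:

```
# Generated by Hugo on each build; no need to commit these
public/
resources/_gen/
.hugo_build.lock
```

With this in place, only the files needed to rebuild the site end up in the repo, and the remote system regenerates the rest.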
Sidenote: Storage is not an operating system. You can't run Hugo on pCloud or Dropbox. Moving them from there to a third system makes it even worse.
Hugo is the motor in your car. It requires tires. Hugo does not care about the color or the number of seats. All it does is turn a wheel that in turn starts the tires turning. That's why Hugo is so powerful: you can put it in any car. Just to stay with the car analogy.
Sidenote: Storage is not an operating system. You can't run Hugo on pCloud or Dropbox. Moving them from there to a third system makes it even worse.
That makes sense now. I forgot the OS is not seen by these mapped cloud drives.
You need no backup of every single file on your local installation, just the files to recreate the site. Those missing directories are created when Hugo runs and are not relevant to re-install it. If you want a “backup” of your website, then back up the public folder after running hugo locally.
So would it be fair to say there is not really a need to back up the public folder? Because as long as the remote Git repo is up to date with the local repo, one could always just clone the remote repo and build the site again locally, e.g. in the event someone steals my laptop.
Before, I was thinking I would need to back up both:
- the public folder
- the entire Hugo directory
and back them up separately. But it seems just making sure the remote Git repo matches the local one is enough.
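That restore path can be sketched end to end. Here a local bare repository stands in for the GitHub remote so the example is self-contained, and the final `hugo` rebuild is left as a comment since it needs Hugo installed on the new machine:

```shell
set -e
# Stand-in "remote" with one commit (normally github.com/you/site.git).
remote=$(mktemp -d)/site.git
git init -q --bare "$remote"
seed=$(mktemp -d)
cd "$seed"
git init -q .
echo 'title = "demo"' > hugo.toml
git add hugo.toml
git -c user.email=you@example.com -c user.name=You commit -qm "seed"
git push -q "$remote" HEAD

# The actual restore: clone into an empty directory, then rebuild.
restore=$(mktemp -d)/site
git clone -q "$remote" "$restore"
# hugo --source "$restore"   # rebuild step; requires Hugo on the new machine
```

As long as the remote has the latest commits, the clone plus one `hugo` run recreates everything, including public/.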
On websites that I touch maybe once a year I always keep a backup of public around, because much can change in a year and I might have issues creating a new deploy. In that case, and if I broke something on the live site, I have the backup so the site stays up while I sit on the smaller issues preventing a deploy. But in general you won't need a backup of public.
In terms of actually restoring the “backup” from the remote Git repo, should it be as simple as cloning the remote repo into an empty directory on the local machine? I tried this but got a Hugo error about missing layout files. I actually tried twice, in case I had made a simple mistake somewhere, but both times I got the same error.
I feel like I am struggling with something really basic that everyone else understands instinctively.
You need to create yourself a README.md in your repo to remember these steps for when, in some months, you want to recreate the site. The problem is that neither Git nor Hugo automatically loads the theme in your themes folder. So you basically have to follow these steps:
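The exact steps aren't quoted here, but assuming the theme was added as a Git submodule — the usual Hugo setup, and a guess on my part — the fix on a fresh clone is to fetch the submodule as well. A self-contained sketch using local stand-in repositories (the `protocol.file.allow` override is only needed because the stand-ins use local paths):

```shell
set -e
# Stand-in theme repository (normally e.g. github.com/someone/some-theme).
theme=$(mktemp -d)/theme.git
git init -q --bare "$theme"
t=$(mktemp -d)
cd "$t"
git init -q .
touch theme.toml
git add theme.toml
git -c user.email=you@example.com -c user.name=You commit -qm "theme"
git push -q "$theme" HEAD

# Site repository with the theme as a submodule under themes/.
site=$(mktemp -d)
cd "$site"
git init -q .
echo 'title = "demo"' > hugo.toml
git add hugo.toml
git -c user.email=you@example.com -c user.name=You commit -qm "init"
git -c protocol.file.allow=always submodule add "$theme" themes/demo
git -c user.email=you@example.com -c user.name=You commit -qm "add theme"

# On the new machine: clone AND fetch submodules, otherwise themes/ is empty
# and Hugo fails with missing-layout errors.
clone=$(mktemp -d)/clone
git -c protocol.file.allow=always clone -q --recurse-submodules "$site" "$clone"
```

On an already-cloned repo, `git submodule update --init` fills in the empty theme folder after the fact.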
@sientelo why are you running hugo with sudo ??? This is bound to create permissions problems, or am I horribly wrong on this?
You are probably right. I did it before because I didn't understand why Hugo wasn't working when I ran it inside a drive mapped to the cloud, so I used sudo to see if it was some permission error. Then someone pointed out Hugo is not installed on the drive mapped to the cloud, just the drive with my local OS on it, which is why Hugo wasn't working there.
Thank you so much. I have saved your comments in my notes; if there's one thing I am capable of, it's taking notes when I find the answer to something I previously struggled with, like this.
It is actually quite a relief to learn this. I have spent way too much time the past few days trying to do the above. I almost felt like giving up on the static-website idea and going back to WordPress, but not anymore.