Manual transfer of Hugo-generated files to S3

Hi,

Is there a way to generate website files locally and transfer (upload) them to S3 manually (using AWS console)?

If so, how do you generate the files (is there a Hugo command for this)?

Thank you,

I see that hugo -D does this, thanks

The “-D” flag is for building drafts. Plain hugo will build your site.

See the docs or run hugo --help for more options.
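For reference, a minimal sketch of the build step (using Hugo's default output directory, public):

```
# Build the site; output goes to ./public by default
hugo

# Optional: serve the site locally first to check that pages,
# CSS and images load before uploading anywhere
hugo server
```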

Thanks @frjo

I was expecting to see HTML files in the created ‘public’ folder, but I see XML files.

Have you run into this before?

Thank you

The command hugo should create a complete website with all html and whatnot.

I use Visual Studio Code (great software!) and upload all files to the server with the SFTP extension by liximomo. My 2000-page site takes about 5 minutes to upload.

The RSS feeds are XML, but most pages are HTML, together with whatever images, CSS, JS etc. you have.

If the site does not build correctly with hugo, you have a separate problem that you need to fix first.

@frjo, @baker thank you! I was able to get the HTML generated. I uploaded the files and folders located in the public folder to AWS S3. Now the page loads from AWS, but without any of the images or page layout. I think I will have to somehow deploy to AWS using the command line (and can’t just grab the files and upload them to S3)?

Have you set the correct baseURL? This is really important.

Have you followed Hosting a static website using Amazon S3 - Amazon Simple Storage Service?
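If baseURL in your site config still points at localhost or another domain, the generated links to CSS, JS and images will not resolve from the S3 endpoint. A minimal sketch, assuming the website endpoint below is a placeholder for your own bucket and region:

```
# Rebuild with the public address of the site as baseURL
# (placeholder endpoint; substitute your own bucket/region or domain)
hugo --baseURL "http://example-bucket.s3-website-us-east-1.amazonaws.com/"
```

You can also set baseURL permanently in your site configuration instead of passing it on the command line.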

The content of the “public” dir after you have successfully run the hugo command is your complete site. How you upload it does not matter.
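If you later want to avoid clicking through the console, one option is the AWS CLI; a minimal sketch, assuming the CLI is installed and configured and the bucket name is a placeholder:

```
# Mirror the contents of public/ to the bucket,
# deleting remote files that no longer exist locally
aws s3 sync public/ s3://example-bucket --delete
```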

Thank you @frjo, it is working now