I’m new here, and I’m currently looking at prospective solutions to the issues my business’s site is experiencing when run through Google’s PageSpeed Insights, GTmetrix and other performance measurement platforms, before I invest time going down a specific path that may not yield the results we require.
I’m hoping that someone in the community may be able to shed just a bit of light on whether Hugo might be a possible solution to some, or all, of the challenges that we are facing.
Currently our subdomain scores very poorly on mobile performance, anywhere from 19 to 35. We cannot understand this inconsistency, as the score changes on each pass, but it is always under 40.
With recommendations from PageSpeed Insights and GTmetrix (to name but two) ranging from reducing unused JavaScript and eliminating render-blocking resources to avoiding enormous network payloads and serving static assets with an efficient cache policy, it appears pretty much impossible to address some of these complaints in the context of our fully static, hand-coded HTML, CSS and JS website built around Bootstrap 4.0.
Static site generators (SSGs) have been suggested to me on several occasions as a solution to some of these problems, so I am trying to understand whether something like Hugo would provide any kind of path to addressing these page speed issues if we were to effectively rebuild our site with it as a baseline.
As I understand it, introducing a CDN may also contribute to an improvement in page speed performance. This is something I am trying to gain a deeper understanding of via conversations with Cloudflare and Azure.
I’m well aware that there is no comprehensive silver bullet for the penalties issued by performance indicators such as PageSpeed Insights. Having said that, I have been told several times now, by more than one developer, that I would most definitely benefit from using an SSG.
I would truly appreciate any helpful insights from folks out here who may have experienced similar issues, and/or have had to address them whilst using Hugo as a framework, or who have even redeveloped in Hugo in part to address such issues.
I would recommend rebuilding your site using a starter such as Hyas. I created Hyas to provide a solid starting point for building modern websites that are secure, fast, and SEO-ready out of the box: A+ scores on Mozilla Observatory and 100 scores on Google Lighthouse.
@frjo is right. The problem is your site, and you first need to follow the guidelines that the speed measuring tools provide to optimize it. Hugo can help in some ways (like stripping unused CSS, which you can do with Hugo Pipes), but you need to rethink how your site is designed. An efficient cache policy is a server-side thing, though. I would advise you to move your site to Hugo and develop it locally, then deploy it as a test site and compare it with your current site. From there, you can follow the tips given by the speed testing sites and optimize accordingly. But beware: Hugo has a learning curve.
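To make the unused-CSS tip concrete, here is a minimal sketch of the setup described in the Hugo docs: Hugo writes every tag, class and id it actually renders to hugo_stats.json (once you enable build stats in your site config), and PurgeCSS strips everything else when the stylesheet goes through Hugo’s PostCSS pipe. Treat the file paths, the safelist entries and the plugin’s export style as assumptions to check against your own project and installed versions.

```js
// postcss.config.js: a sketch, assuming an @fullhuman/postcss-purgecss
// version whose default export is a function (newer majors export
// { purgeCSSPlugin } instead), and that build stats are enabled in the
// Hugo config so that hugo_stats.json exists at the project root.
const purgecss = require('@fullhuman/postcss-purgecss')({
  // hugo_stats.json lists every tag, class and id Hugo actually rendered.
  content: ['./hugo_stats.json'],
  defaultExtractor: (content) => {
    const els = JSON.parse(content).htmlElements;
    return [...(els.tags || []), ...(els.classes || []), ...(els.ids || [])];
  },
  // Safelist selectors that JS adds at runtime (these two are just examples
  // of classes Bootstrap toggles; audit your own scripts for the real list).
  safelist: ['show', 'collapsing'],
});

module.exports = {
  // Hugo sets HUGO_ENVIRONMENT when it runs PostCSS; only purge in
  // production so local development keeps the full stylesheet.
  plugins: process.env.HUGO_ENVIRONMENT === 'production' ? [purgecss] : [],
};
```

Your template then pipes the stylesheet through Hugo’s PostCSS function (`css.PostCSS`, or `resources.PostCSS` in older releases) before minifying and fingerprinting it.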
I would love to optimize a fully static, hand-coded HTML, CSS and JS website based around Bootstrap 4.0. I know how to get a 100% Google Lighthouse score. Feel free to send me a DM!
Having now rebuilt the corporate website referenced in my original question, I can guarantee you that it is not possible to achieve 100% scores across all of Google’s Core Web Vitals markers whilst usage of standard measurement tools such as GTM, GA and HS is required.
The statement you make on your domain, ‘simply use less Javascript and CSS and when I say less… I mean significantly less’, is not simple at all. Either you do not operate within the realm of marketing, or you hold disproportionate sway over that department wherever it is that you work.
No marketer on earth will submit to a statement like ‘just use less javascript’ when it comes to collecting the metrics they believe they require in order to demonstrate ROI. A more accurate statement would be: ‘learn to manage the expectations of those in your organisation who demand performance but do not understand how to achieve it’. That creates the groundwork for a conversation with the C-suite and marketing, educating the individuals in both (or more) teams about what they can realistically achieve if they are stuffing reams of tags and tracking code onto any one HTML page.
I know. I have just finished a twelve-month project to achieve just this.
The problem was not the framework. The wording used in this platform’s promotional material is misleading: the ‘fastest framework’ tagline should read ‘fast for developers’, not fast in search performance metrics, because it is not. Small helpers such as the unused-CSS functionality can provide some relief, yes, but that is a limited salve for the challenge of the Core Web Vitals.
This project was in any case rebuilt using the later framework release, Bootstrap 5.2, which has in turn yielded CWV scores of anywhere between 80 and 89 on a page-by-page basis.
No need for funky convoluted frameworks massively overcomplicating what should be and is a simple build.
The CSS and JS in these links total 270 KB (combined) and are (probably) render-blocking during load. Processing that much CSS and JS takes a long time. If you focussed on performance, you probably reduced the size of these files by removing unused components. You should aim for about 14 KB of CSS and 1 to 2 KB of JS, as described in my article How to get a 100% Google Lighthouse score. Not (just) because your site is otherwise too heavy, but mostly because these files have to be parsed and executed. I am very curious whether you came anywhere close to those figures at all. If not… I would love to look at those thousands of lines of (unused) JS and CSS and help you reduce them.
After the page has loaded, upon the first interaction (scroll, click, mousemove, etc.), you can load all your marketing JS, as described in my article Monkey business with banana leafs. This will NOT count against or negatively influence your performance (until the new Interaction to Next Paint metric lands AND Google looks at RUM instead of synthetic results). I know that this is technically kind of cheating… but it works like a charm when you want to trick Google AND get instant page loads.
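For anyone who does not want to dig through the article, here is a minimal sketch of that first-interaction pattern, using Google Tag Manager as the example. The container ID is a placeholder, and the event list is just a reasonable guess at what should count as a first interaction; the article may do this differently.

```js
// Hold back tag/analytics scripts until the visitor first interacts.
// 'GTM-XXXXXXX' is a placeholder container ID, not a real one.
window.dataLayer = window.dataLayer || [];

let marketingLoaded = false;

function loadMarketingScripts() {
  if (marketingLoaded) return;
  marketingLoaded = true;

  // The standard GTM bootstrap, executed only after the first interaction
  // instead of in the document head.
  window.dataLayer.push({ 'gtm.start': Date.now(), event: 'gtm.js' });
  const s = document.createElement('script');
  s.async = true;
  s.src = 'https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXXX';
  document.head.appendChild(s);
}

// { once: true } removes each listener after it fires; the marketingLoaded
// flag guards against two different events firing in the same frame.
['scroll', 'click', 'mousemove', 'touchstart', 'keydown'].forEach((evt) =>
  window.addEventListener(evt, loadMarketingScripts, { once: true, passive: true })
);
```

Because nothing marketing-related executes before the first interaction, the synthetic Lighthouse run never sees it, which is exactly the ‘cheating’ described above.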
Again… I would love to help you. And for the record: I work mainly for advertising agencies.
Simple might be a matter of taste. I think that using Bootstrap is very complicated… while you might find it simple. I think that hand-coding HTML and CSS is simple, as it involves very little code. A good example of that is these single-file websites: https://www.zengardenwebsites.com. We could debate ‘simple’, but I do not think that is the point here.
As for the ‘fastest’ claim: it only applies to build speed, nothing else. I thought that was clear to everybody… but apparently not. Hugo is not even the fastest, but that is for another topic. The speed at which you can build your website does not influence how fast it loads. Every SSG is equal when it comes to performance, as long as it produces a static website (HTML and CSS) and not a dynamic or JS-driven one.
This means that, for example, Hugo and Jekyll can be equally performant, while the Hugo site takes just a few seconds to rebuild and the Jekyll site several minutes. To illustrate: I am currently porting a lot of Jekyll websites to Hugo, and their CSS and HTML stay exactly the same (thus so does their Lighthouse performance). A fun fact: one of these websites was building on GitHub Pages (using GitHub Actions) with Jekyll and took about 20 minutes to rebuild. With Hugo on my own server I can rebuild it in seconds.
If you have any questions, or if you would like me to review your website (privately), I would be happy to do so. Just send me a message.