Maybe you already knew this, but Netlify is better than AWS!

I’m new to this whole thing, and have learned the following:

  1. GitHub has a 1GB limit for hosting, so it probably won’t work for me.
  2. AWS is terribly complicated, but even after you get it configured, pretty URLs are a pain! There’s something called Lambda, which I fought with for hours today and couldn’t get working (I’ve pasted the kind of thing I was attempting below this list).
  3. Netlify gives 100GB of storage for free! It’s easy to upload to - and it works!
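
For reference, here’s roughly the kind of Lambda@Edge function people seem to use for pretty URLs on S3 + CloudFront. This is only a minimal, untested sketch (it assumes CloudFront sits in front of the S3 bucket, and the names are illustrative), not something I’ve gotten working myself:

```ts
// Minimal Lambda@Edge "origin request" sketch that maps pretty URLs onto the
// index.html objects S3 actually stores (e.g. /about/ -> /about/index.html).
// Assumes CloudFront in front of the S3 bucket; untested, names illustrative.
export const handler = async (event: any) => {
  const request = event.Records[0].cf.request;

  if (request.uri.endsWith("/")) {
    request.uri += "index.html";      // /blog/ -> /blog/index.html
  } else if (!request.uri.includes(".")) {
    request.uri += "/index.html";     // /blog  -> /blog/index.html
  }

  return request;                     // forward the rewritten request to S3
};
```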

Seriously, I’m not sure why anyone would use anything else!!!

3 Likes

While I agree that Netlify is great, there can be cost savings with AWS if you go beyond the “free stuff”.

I just wish Amazon had a modern and easy-to-use interface for non-techies.

3 Likes

Agree that AWS looks like it was designed by a horde of the geekiest of geeks, given free rein to “design it however you want”, resulting in the current mean XML soup.

But echoing what is mentioned above, I’m finding it super cheap (and am not having any trouble with pretty URLs, or indeed with getting content up onto it, thanks to s3deploy).

2 Likes

I’m using Firebase; more specifically, I use Hugo + GitLab (free private repos + free CI) + Firebase. I chose Firebase thanks to this comparison of GitHub vs Netlify vs Firebase vs a bunch of other static hosting providers. Firebase comes out on top because it’s the fastest. I believe it’s 1 GB of free hosting and 10 GB of free traffic, which is plenty for a personal website. One other big upside of Firebase compared to everything else is that it comes with a CDN out of the box (provided by Fastly), which has way better coverage than Netlify’s. AWS does not offer a CDN out of the box. Firebase is also pretty cheap if you run out of free space/traffic.
Here and here are easy-to-follow tutorials on how to set it all up.
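
The Firebase Hosting side is basically just a tiny firebase.json. A minimal sketch (assuming Hugo’s default “public” output folder; adjust paths to your own setup):

```json
{
  "hosting": {
    "public": "public",
    "cleanUrls": true,
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
  }
}
```

After that, firebase deploy (run locally or from the GitLab CI job) pushes whatever Hugo built.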

1 Like

Cool, but my website is larger than a gig due to lots of photos (transferring from WordPress). GitHub has a max of 1GB, so it’s my understanding that I can’t connect my site to GitHub. Is there another alternative?

I use Bitbucket for my personal site. It has a 1 GB soft limit and a 2 GB hard limit.

https://confluence.atlassian.com/bitbucket/what-kind-of-limits-do-you-have-on-repository-file-size-273877699.html

One solution to your image library problem is to host images separately on a dedicated image host like Cloudinary or imgix. But you are right, Netlify solves everything with its free 100GB.

1 Like

We use Netlify for our sites, but S3 + Imgix for images.

2 Likes

These are excellent points. I think individual users should avoid AWS, because it’s not ideal for simple situations; the only reason I can see for an individual to use it is to learn more about it. For simple cases like that, an all-in-one solution like Netlify is a good choice.

That’s for individual users. Small startups can also benefit from Netlify, and many do. My company is not big, but we’re already using AWS to support our SaaS product, so using AWS isn’t a problem. I can easily get engineering support for AWS. In addition, I want to use GitHub as a way to migrate to documentation as software.

I want that, and I see no problem with it as long as we’re keeping the size/amount down.

1 Like

So I work with AWS every day (though I’m not an employee), and I hold all 9 currently available certifications, so perhaps I fall into the “geekiest of geeks” horde mentioned above.

I think the benefit of AWS comes if you need more capabilities. I don’t know if Netlify offers a CDN option, but AWS offers CloudFront. They also have JavaScript APIs that can be used from the browser to build integrations with all the other services they offer. Want a database? Talk directly to DynamoDB. Want to provide user logins? Integrate with Cognito. Want to build a web application? Use API Gateway and Lambda. Need really high availability? Use S3 bucket replication to another AWS region that can act as a failover.
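
For example, reading from DynamoDB straight from the browser with Cognito credentials looks roughly like this. It’s just a sketch with placeholder region, table, and identity pool values, not production code:

```ts
// Sketch: read an item from DynamoDB directly from the browser, using an
// (unauthenticated) Cognito Identity Pool for credentials.
// Region, table name, and pool ID are placeholders, not real values.
import { DynamoDBClient, GetItemCommand } from "@aws-sdk/client-dynamodb";
import { fromCognitoIdentityPool } from "@aws-sdk/credential-providers";

const client = new DynamoDBClient({
  region: "us-east-1",
  credentials: fromCognitoIdentityPool({
    clientConfig: { region: "us-east-1" },
    identityPoolId: "us-east-1:00000000-0000-0000-0000-000000000000",
  }),
});

export async function getGuestbookEntry(id: string) {
  const { Item } = await client.send(
    new GetItemCommand({
      TableName: "guestbook",         // placeholder table name
      Key: { entryId: { S: id } },    // simple string partition key
    })
  );
  return Item;
}
```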

Of course, if you just need static hosting, none of this is relevant, but if you need other options, chances are you’ll get them with AWS.

3 Likes

I’m not quite understanding how Netlify has a 100GB storage limit when it pulls from (typically) GitHub, which has a 1GB limit. Does the limit on Netlify mean whatever webpage it is storing on its CDN? So if you had your code on GitHub and it synced and built that, but your images or media were on Cloudinary or something else, would it have to pull that down too? Or wouldn’t Cloudinary be the place where the assets were accessed on page load? I guess I’m not seeing how you could ever transfer 100GB to Netlify… let alone do it from a GitHub-linked account.

I’m biased towards “easy” services…

My site, The Free Bundle, is now an online magazine, but before that it was one of the first “get free games in a bundle” websites. My shared hosting said we had an “unlimited” plan, all included.

Then one day we showed up on the front page of Reddit, CNET wrote an article about us, and Amazon contacted me because they were tweeting our website.

Next thing I knew the site was down.

Turns out unlimited is not REALLY unlimited.

Storage is one thing; RAM is another thing entirely. You can have all the storage in the world, but if you are limited by RAM… then people won’t be able to reach your website when they need to.

Now, I asked Netlify how much RAM their free package supported.

They simply evaded the question.

With AWS I know exactly what I get (a headache, but… I get what I pay for).

P.S.: I have a budget limit of $400 with AWS. If my monthly bill goes beyond that point (which is extremely high for a server), alerts are fired to my email, phones, and even the landlines in my office building. I made a panic command, which is the equivalent of throwing a Molotov at my AWS setup until I can reach a desk. It won’t stop visits, but it will at least slow things down until I can reach the Batcave™.

To this end, I wrote myself a handy “DON’T PANIC” booklet that sits safely in my desk drawers (office and home). There, I have everything I need to take down my AWS servers. I’d rather not trust my brain at that point and simply follow procedures, NASA style.
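
For the curious, the “panic command” is nothing fancy. Roughly this kind of script (a sketch with a made-up bucket name; blocking public access on the S3 bucket is just my blunt way of cutting traffic off at the source):

```ts
// "Panic button" sketch: block all public access to the site bucket so traffic
// (and the bill) stops growing until I can get to a real keyboard.
// Bucket name is a placeholder; this is illustrative, not my exact script.
import { S3Client, PutPublicAccessBlockCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

async function panic() {
  await s3.send(
    new PutPublicAccessBlockCommand({
      Bucket: "my-static-site-bucket",
      PublicAccessBlockConfiguration: {
        BlockPublicAcls: true,
        IgnorePublicAcls: true,
        BlockPublicPolicy: true,
        RestrictPublicBuckets: true,
      },
    })
  );
  console.log("Public access blocked. Now go read the DON'T PANIC booklet.");
}

panic().catch(console.error);
```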

Because that’s not relevant for a static site. Netlify doesn’t publish RAM figures, in the same way Amazon S3 doesn’t publish RAM limits. Netlify isn’t the equivalent of all of AWS, and it’s not really a shared web host.

It is relevant if you are getting hit by Reddit, Amazon, CNET, and many others at the same time. If you are talking about a low-traffic site, then no, it is not relevant.

But if you happen to have the luck of getting thousands upon thousands of concurrent visits, then you will have a problem. Static or not (I presume, since my site at the time was static, pure HTML files).

Amazon scales, then bills you.

1 Like

You can get hit by Reddit, Amazon, and CNET on Netlify and be fine. It’s not the same as typical shared hosting. But if you truly believe I was wrong, why not correct me on the RAM limits of Amazon S3? You can consider Netlify to have “unlimited” RAM because its limits are elsewhere. Unlike shared hosting, there’s no one specific server that your site is on.

You can get hit by Reddit, Amazon, and CNET on Netlify and be fine.

No, it won’t.

It’s not the same as typical shared hosting.

No, it’s not. It’s worse: they don’t disclose their RAM limits. At least the crappiest web host out there gives you that information. Netlify doesn’t give out this information, not even for their paid plans.

Of course, anyone’s free to try their service, though. All I can say is that I won’t be falling into that trap twice.

When my site went “viral” for a couple of days and, all of a sudden, RAM mattered for a static website… well, if it hadn’t been for the good-hearted people who lent me a hand, things would have gone a lot worse.

I’m simply trying to share my personal experience with these “free” services. No idea if Netlify has 4GB of RAM allocated for concurrent traffic for each free account.

Maybe they do. About as likely as seeing a pig fly, but, you know. Perhaps.

All I say is… caution, people. Caution with your projects, above all.

One invests a great deal of time and money into a web project. Hosting providers should also be part of that investment. There ain’t no such thing as a free lunch.

Why not correct me on the RAM limits of Amazon S3?

Amazon S3 is pretty much the same as Lightsail (which came out later, based on their experience of how people were using S3 buckets to host static sites); you can find the RAM limits for Lightsail on this page: VPS, web hosting pricing—Amazon Lightsail—Amazon Web Services

If you need more information about Amazon S3 limits, I recommend going through their documentation page.

Netlify works for many and is great if requirements match the functionality they provide.

I am not sure how many requests are generated when “going viral”, but S3, especially in conjunction with CloudFront, seems to support some impressive rates. Additionally, other features are available to enhance, protect, and secure a site, as BigBadBaz stated.

https://docs.aws.amazon.com/AmazonS3/latest/dev/request-rate-perf-considerations.html

Amazon S3 automatically scales to high request rates. For example, your application can achieve at least 3,500 PUT/POST/DELETE and 5,500 GET requests per second per prefix in a bucket. There are no limits to the number of prefixes in a bucket. It is simple to increase your read or write performance exponentially. For example, if you create 10 prefixes in an Amazon S3 bucket to parallelize reads, you could scale your read performance to 55,000 read requests per second.

1 Like