Anyone for Webmention?


Curious as to whether anyone has integrated Webmentions into their Hugo site. There is mention of one Hugo website, but no detailed walk-through that I can find. Heck, there’s even a WordPress plugin for it! Also noticed Perch CMS has implemented it as well.

I’ll be doing it anyway, and will write up a blog post when I do :slight_smile:


I am interested, looking forward to your blog post.


If you’ve gone indie enough to log in with your site, one of the very easiest ways is via :slight_smile:


Interesting, I’m in the process of setting up the “indie logging in with own site”, looking forward to checking out – thanks for the heads up :slightly_smiling_face:


Were you able to make progress with Webmentions integration? I added the rel="me" links to my site head, and am also trying to put the pieces together of what’s written at… but am still confused.

Would like to figure out how to replace Disqus completely with Webmentions.


Hey, I have not implemented it in any Hugo sites before, but I’ve dealt with IndieWeb (<3) before, so I’ll be happy to help if you have questions. @kaushalmodi, I ended up writing a sort of super brief/dense tutorial on basic IndieWeb stuff, because I figured you wouldn’t be the only one in the community struggling to find their way around so many wikis and standards. It’s really just a draft. What do you think? :slight_smile:

So if you have your rel=me links in place, you are set up to use IndieAuth, i.e. trust external web services to authenticate you elsewhere. ⁽¹⁾ That is not what Webmention is about: this is the identity-management side of the IndieWeb. It will be useful, for instance, to authenticate you on the service.

So if you intend to use webmentions as a replacement for Disqus (amazing quest!), you need to:

  • add semantics to your templates (microformats2)
  • use/write an endpoint to interact with other websites (webmention)
  • integrate this external content in your site (two methods)

So to integrate the data from your webmention endpoint into your website, there are two possibilities:

  • update your templates to fetch data from this endpoint (using getJSON for instance)
  • let the endpoint or an intermediary script directly write to your website data folder

Add semantics to your templates

Interactions on the indieweb are interpreted by parsing class fields in the page. So you need to add semantic classes throughout your templates to let other websites understand your content. That allows someone to share an article of yours on their site, or to ‘like’ it, for instance. It also allows them to publish an article or a note in reply to your article by letting a link point to your URL with the class u-in-reply-to.

For instance, I could just publish on my site something along the lines of:

<article class="h-entry">
    <p>In reply to <a href="">@kaushalmodi</a>'s <a href="" class="u-in-reply-to">Hugo: Leaf and Branch Bundles</a></p>
    <p class="e-content p-summary">Thanks for the recap on resource bundles! I feel enlightened :)</p>
</article>
That would be interpreted as a comment on your website. Your webmention endpoint, processing this reply, would then look up my h-card to establish a profile on me, which could then be displayed on your website alongside my comment.

There are many examples of how to use microformats2 on the microformats wiki. You should take a look :slight_smile: The basics:

A collection of content should have class h-feed. An individual piece of content is either an h-entry, an h-event, or an h-card (for persons). Any of these may have a p-name class around its title and a u-url class on its permalink. Typically only articles (blog posts) have a p-name; a simple note doesn’t. An image associated with a piece of content is u-photo and the date is dt-published.
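Putting those classes together, a feed might look something like this (a minimal sketch; the titles, dates, and URLs are all made up):

```html
<!-- An h-feed containing an article (has a p-name) and a note (doesn't) -->
<div class="h-feed">
  <article class="h-entry">
    <h1 class="p-name">My first article</h1>
    <a class="u-url" href="/posts/my-first-article/">permalink</a>
    <time class="dt-published" datetime="2018-02-01T10:00:00Z">1 Feb 2018</time>
    <img class="u-photo" src="/images/cover.jpg" alt="cover image">
    <div class="e-content">Full text of the article…</div>
  </article>
  <article class="h-entry">
    <!-- A simple note: no title, so no p-name -->
    <div class="e-content">Just a short note.</div>
  </article>
</div>
```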


A webmention is basically a web ping. It informs a website that one of its URLs was mentioned by another URL. A webmention typically looks like this (taken from W3C recommendation):

POST /webmention-endpoint HTTP/1.1
Host: aaronpk.example
Content-Type: application/x-www-form-urlencoded

source=https://waterpigs.example/post-by-barnaby&target=https://aaronpk.example/post-by-aaron

HTTP/1.1 202 Accepted

Upon receiving a webmention, your website should parse the remote URL and figure out from its microformats2 markup what the interaction means.


Your webmention endpoint is usually declared in your page head to indicate to other websites how to let you know about stuff:

<link href="http://aaronpk.example/webmention-endpoint" rel="webmention" />

So if, to get started, you would like to use that endpoint, just log in there using IndieAuth (authentication through one of your rel=me accounts), and then add the following snippet to your head:

    <link rel="pingback" href="" />
    <link rel="webmention" href="" />

From here on, websites trying to contact you will talk to that endpoint. Then we’ll need to integrate all the data we receive with Hugo.

Hugo integration

So now, what we need is to add to our leaf and branch templates something to fetch the interactions (webmentions) for the content we’re building. Let’s go for the simplest option for the moment, and simply fetch this data using {{ getJSON (printf "" .Permalink | urlize) }}

You can however request it by type (like, comment, reshare…). If you’re parsing some content and you’re not sure what kind of content it is, just take a look at post-type discovery.

So from there you just need to do some basic templating with the fetched JSON. A comment should be an h-cite within your h-entry. And you’re basically all done on the receiving end! :smiley:
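As a sketch, such a partial could look like this. The endpoint URL is a made-up placeholder, and the field names are assumed to match the example JSON response quoted elsewhere in this thread (links, data.author, data.content, …):

```go-html-template
{{/* Sketch only: endpoint URL is a placeholder; field names assumed
     to match the example webmention JSON shown in this thread */}}
{{ $url := printf "https://endpoint.example/api/mentions?target=%s" .Permalink }}
{{ $mentions := getJSON $url }}
{{ range $mentions.links }}
<article class="h-cite">
  <a class="p-author h-card" href="{{ .data.author.url }}">
    <img class="u-photo" src="{{ .data.author.photo }}" alt="">
    {{ .data.author.name }}
  </a>
  <div class="e-content">{{ .data.content | safeHTML }}</div>
  <a class="u-url" href="{{ .data.url }}">
    <time class="dt-published">{{ .data.published }}</time>
  </a>
</article>
{{ end }}
```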

So why would we want to use an external script and go through the data folder to handle this? Because this way we don’t have any moderation, while a script exporting the webmentions (fetched from the endpoint) to your data folder through git would let you do basic moderation within GitHub/GitLab using pull requests.


Sorry, that was a bit long (and it’s not over just yet). I hope it helps. Don’t hesitate to point out unclear parts. Maybe I’ll talk about sending webmentions next time!

Also, please note all this indieweb stuff might slowly be deprecated by the emergence of ActivityPub, the new federation protocol for the social web that’s basically a PubSub infrastructure built on the web with ActivityStreams2.0 as JSON vocabulary to represent the interactions. ActivityPub is already supported by forward-thinking platforms like Mastodon and Peertube (and many more coming).

So if you need to structure your content with a section by type (eg /note/ /like/ /article/), you should consider using ActivityStreams2.0 vocab because it’s more complete than microformats2.

⁽¹⁾ So if you wouldn’t like some service to authenticate you, you shouldn’t add the rel=me link to your profile on that site (that’s imo the main weakness of the indieauth spec).


First of all, Wow! Thank you so much! I didn’t expect such an amazing answer, and it’s great! I will start working on the missing bits, and get back with more questions.

Why don’t you blog this somewhere, so that I can send you a webmention? :slight_smile:


Yes, I understood that part.

Exactly! It’s a bit ironic that to get started with IndieAuth, you need to have a Twitter or GitHub or similar “silo” account?

Your post brings a bit of clarity to these things; so thanks! Umm, what’s an “endpoint”? I don’t have any webdev background other than Hugo templating.

Is that service an endpoint? I have already signed up there.

Hehe, good example :smiley:

That’s helpful. I might end up with memorable param values that translate to those classes.

I don’t yet have the microformats2 classes set up… But I was still able to receive a test webmention from someone. And then I linked to my Twitter account, and that started flooding in the likes from over there as mentions.

So the importance of the h-* classes is still not 100% clear.

Yes, done. That part was easy.

Now this is what I need to start working on.

… services like that dump all mentions from Twitter and other places into the endpoint… so all of that should get collected for each post separately… correct? Then there would be just one place (the blog page) where you can see all the interaction.

I did not understand this part… What is the relation between that script (btw, which script? Do you have an example?) and Github/Gitlab PR?

Are you kidding? This was awesome!

It would be awesome to know an automated way/script that sends webmentions each time a new post is made. Also, it’s not clear whether duplicate webmentions can be sent by mistake… what if I edited a post 5 times, and each commit caused the same webmention to be re-sent? If so, how to prevent that?

For now, I learned of a manual means to send a mention:

Really?! :frowning:

I hope all this effort doesn’t go to complete waste…

I am not sure what I should consider… I will look into Webmention vs ActivityStreams2.0 vocabulary… This is the very first time I am hearing about that standard.

PS: My quest into Webmention got invigorated because a kind gentleman Christian Weiske replied with this :slight_smile:



Because I don’t have a blog anymore, I’m too busy writing a JS-free theme from scratch (docs outdated) for the squat I live in − and future militant websites − to worry about this at the moment :slight_smile:

I just didn’t have time yet to implement it in the theme, but as you can see I’ve already thought it through. That’s the main reason I’m advertising for configurable cache TTL which nobody seems to care about ^^

Well you can do it with email (with or without PGP signature), and we could implement XMPP auth as well (super easy). But I find it weird indeed that there is no difference semantically between “this URL is also me” and “I trust this URL to authenticate me”.

An endpoint is a generic term for a webpage that processes requests. A webmention endpoint will store, for each of your pages, the pages that pinged it, and usually try to fetch information from them.

For instance, when it receives a request, the endpoint will fetch the source URL (the page that pinged you) and parse its microformats to be able to serve you JSON like this:

{
  "links": [
    {
      "source": "",
      "verified": true,
      "verified_date": "2013-04-25T17:09:33-07:00",
      "id": 900,
      "data": {
        "author": {
          "name": "Tantek Çelik",
          "url": "",
          "photo": ""
        },
        "name": "Another milestone: @eschnou automatically shows #indieweb comments with h-entry sent via pingback",
        "content": "Another milestone: <a class=\"auto-link h-x-username\" href=\"https:\/\/\/eschnou\">@eschnou<\/a> automatically shows #indieweb comments with h-entry sent via pingback <a class=\"auto-link\" href=\"http:\/\/\/entry\/testing-indieweb-federation-with-waterpigscouk-aaronpareckicom-and--62-24908.html\">http:\/\/\/entry\/testing-indieweb-federation-with-waterpigscouk-aaronpareckicom-and--62-24908.html<\/a>",
        "published": "2013-04-22T15:03:00-07:00",
        "published_ts": 1366668180,
        "url": ""
      }
    }
  ]
}

This data was not contained in the webmention itself (just a ping of one URL to another). It was parsed on the receiving end from Tantek’s website following the post-type discovery algorithm (see the webmention spec).

That would be useful, at least for the in-reply-to field. Apart from this, p-name is your title, dt-published is your date (timestamp format), etc… so I think the rest of those “classes” are already in your frontmatter. But then, if you want to publish different content than mere articles, you will probably need a new archetype (eg. for events/RSVPs).
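For instance, mapping the usual frontmatter fields to those classes in a Hugo single template could look like this (a sketch; the date layout strings are Go's reference format as used by Hugo):

```go-html-template
<article class="h-entry">
  <h1 class="p-name">{{ .Title }}</h1>
  <a class="u-url" href="{{ .Permalink }}">Permalink</a>
  <time class="dt-published"
        datetime="{{ .Date.Format "2006-01-02T15:04:05Z07:00" }}">
    {{ .Date.Format "January 2, 2006" }}
  </time>
  <div class="e-content">{{ .Content }}</div>
</article>
```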

The microformats2 classes will allow other websites to read your website. Here you could receive a webmention, but try publishing a reply to someone (marked with u-in-reply-to): it will be considered a simple ping (like a pingback), because the remote website will be unable to understand where the post title/content is, what the associated image is, find your profile pic, etc…

So you could receive webmentions, in this case, from services that do not speak microformats2. That’s because you’re going through a service that translates social interactions from silos (eg. Twitter) into standard microformats2 + webmention, so your endpoint can understand what the interaction is about.

Yes, all webmention-based interactions are on a per-URL basis. Usually, the place gathering all interactions is your webmention endpoint; there’s no need to display such a long list on your website. Your homepage may itself display its own webmentions, but usually we don’t, because over time it can become a long list of people linking to you :slight_smile:

Sorry, I wasn’t clear. I was talking about a script to replace the direct integration via getJSON, or a script to replace the hosted endpoint entirely (that is, become your webmention endpoint). Because if you integrate stuff from the endpoint directly, there is no moderation layer. Any website talking webmention can ping your URLs with adverts for viagra and end up on your website!

Also, there’s some data you may want to store locally from this webmention, like the person’s profile picture, so that you don’t have to serve all profile pics from different websites (which would allow for instance Twitter to track all your visitors).

So you really want an intermediary script somewhere, probably running on cron or on a git hook, that will fetch data, and create a PR per interaction to your repo with the JSON payload (in the data folder) and the person’s pic if we don’t have it yet (in the static folder).

Well, you should do it in a git hook, that is, a script that runs when your repo has been committed. From there, just detect changes to the content folder and send appropriate webmentions for new/edited content.

You may send webmention even if your content hasn’t changed, but that’s basically DDOSing people :stuck_out_tongue: so please only do it when your content has actually changed ^^

That’s not easier than sending a normal webmention. See the specs for what a webmention should look like. It’s literally an HTTP request with two URLs inside.
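For example, from the command line (a sketch: both URLs and the endpoint path are made up; in practice you would discover the real endpoint from the target page's rel="webmention" link):

```shell
# Notify their.example that my.example/my-reply/ mentions their post.
# All URLs here are placeholders; discover the real endpoint from the
# target page's <link rel="webmention"> before sending.
curl -i \
  -d source=https://my.example/my-reply/ \
  -d target=https://their.example/some-post/ \
  https://their.example/webmention-endpoint
```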

It’s not going to waste. microformats2 and webmention are always useful for dealing with people who don’t speak ActivityStreams/ActivityPub. But there are only thousands of people speaking microformats2, versus at least a million Mastodon users speaking ActivityPub without realizing it.

What I meant when I said you should order your content following the ActivityStreams vocab is that its vocabulary is more precise and standardized than microformats2, so it will be easier to use as a structure for your content, and then output microformats2 with the HTML outputFormat, and clean AS2 with an activitystreams outputFormat (the other way around would be more complicated imo).

Hehe, that’s a happy encounter then :slight_smile:

I hope I could make some stuff more clear now?


Not yet I’m afraid, ironing out the final few bugs in Indiego, the theme that will use webmentions when I wrap my head around it :slightly_smiling_face:

For a GitHub solution, have a look at Staticman – haven’t used it yet but sounds promising.


wow +1 :smile:

thanks for all this info, will digest at leisure


Thanks for that tip… I got a rough draft working; see the demo (scroll to the very bottom).

Though the webmentions received in the JSON are a lot fewer than what I see on…


Not sure if related, but you may want to put quotation marks around your HTML attributes :slight_smile:

<link href="" rel="me">
<link href="" rel=me>
<link rel="pingback" href="">
<link rel="webmention" href="">
<link rel="authorization_endpoint" href="">

Also, in order for this to work properly, the service needs to have your linked accounts registered as rel=me on your homepage. So if you want to gather feedback from more networks than Twitter and GitHub, you’ll need to add them there :slight_smile:


I had the double quotes originally, but it looks like the HTML minification step removes them where it’s OK for them to be absent. Example: attributes whose values don’t have spaces probably don’t need to be quoted.

Correct. But as you quoted, I already have those there.

Well, those are the only 2 networks that make sense to me to associate with that blog. :slight_smile: I see that a lot more webmentions were sent than I derive from the JSON object received from the endpoint. A very easy way to compare this is the number of likes; they are not the same.

I’ll open an issue on the repo to understand this better.


Are you sure it’s not a caching issue? Have you tried calling the API from some other place than Hugo? (for example from curl)

Because if you don’t run hugo with the --ignoreCache param, it will cache remote content “forever” (unless it’s stored in a tmpfs you clean regularly). So you could get a first batch of webmentions from the API, but this list would actually never be re-fetched for updates :slight_smile:

That’s typically the kind of situation that got me wondering about configurable cache TTL.


I can totally understand that. I couldn’t start writing stuff on my blog before I had my theme in a presentable state to my liking AND had the whole blogging flow set up as I needed it to work through Emacs Org mode.

TIL, thanks.

That’s a good summary, I will include that in a blog post about intro to indiewebbing (there are already many such posts by others, but why not…) :slight_smile:

Now that I have added the microformats2 classes to my template, it makes more sense.

For starting steps, the direct integration seems to work fine. I will have to wait for someone to set up and explain the webmention moderation flow for static sites. But yes, spam moderation will be a concern as webmentions become more widespread.

That’s another unknown for me… exploring that for some other day maybe…

Yes (as I recently learned). I just add a “Send Webmention” form to my posts.

Your help was critical in indiewebifying my site. Thank you! I believe I am almost there in getting rid of Disqus.

Next steps:

  • Refine the webmentions integration
  • Download existing Disqus comments, convert that to a JSON format and present them somehow under the posts. Done!


No, it was something else… the API sends a JSON object with only 20 links by default… I just had to increase that count:

{{ $domain := "" }} <!-- Hard-code the domain during testing on localhost, branches -->
{{ $num_mentions_max := 200 }}
{{ $webmentions_rcv := getJSON (printf "" $domain .RelPermalink $num_mentions_max) }}


That --ignoreCache tip works for me when running hugo server locally. Thanks. Though, yes, it would be nice to get your configurable cache TTL feature baked in.


They have this limit for a reason.

I read quite a bit about Webmentions since you mentioned it in the Disqus thread and I think this thread is a better place to reply.

I really liked the concept of using Webmentions to render comments and likes from various sources in a Hugo site.

But what stopped me from reading further is the DDOS attack vector:

And from what I’ve seen in this link the measures that one needs to take in order to prevent such an attack are quite complex.

@cmal @kaushalmodi What sort of measures are you taking to prevent abuse? Care to share?


That just limits the size of the JSON object returned by the API. I believe it’s 20 because webmentions are commonly displayed using JavaScript, with N mentions per page. As I am creating a static page of webmentions, I don’t need to paginate anything… I just get every webmention out there for that post.

Hmm. I haven’t yet thought much about that. I believe I will have to deal with that when it happens. But your concern is legit if you are thinking about implementing this for a commercial site. In my case, it’s a personal blog which receives a handful of comments per month. If DDoS ever becomes an issue, I might need to just filter those out of the JSON object returned by getJSON.

In summary, the Webmentions commenting approach is much much nicer than using Disqus for me :slight_smile:


Fair enough. I’m not a SysAdmin or an InfoSec expert, but this much I know: never play with things that have potential attack vectors.

Anyway I’ve been looking at your site’s source and how you implemented Webmention. Very clever approach. :+1:

But I was particularly intrigued with how you made those Twitter interaction buttons @kaushalmodi . It got me thinking and I’m going to implement something similar on my blog but with Mastodon (that I joined today).

I’ve been looking into the Mastodon API and it’s amazing! Did you know that you can make a POST request to favorite something on Mastodon through your Hugo site? Also you should be able to fetch Mastodon replies pretty much like you do through Webmention for the Twitter ones.

True Mastodon is small compared to Twitter etc. But… It feels a bit like the web used to be in the late 90s and early 00s and I really like it.


It’s not clear, though, whether any kind of attack is easily possible on my site… it’s static, with a smaller attack surface than even most other static sites. I use and enforce HTTPS, have a pretty strict Content Security Policy (no inline scripts allowed… so even if someone injects inline scripts, they won’t run), disallow frames, … and a lot more (search for Security Headers).

So the worst-case “attack” that I can foresee because of webmentions is a static page with 1000s of spam webmentions (which can be easily taken care of).

Thanks :blush:! It was fun to implement something unique.

Welcome to Mastodon! It has a unique kind of crowd, and it’s a breath of fresh air… no ads! The only reason I haven’t integrated comments directly with Mastodon is that the Mastodon dev recently declined to integrate auto-sending of webmentions, and there’s also no mechanism to back-feed webmentions from Mastodon (as there are so many instances out there). So discussions on Mastodon will stay stuck there… and never show up in the webmentions feed below each post.

In the case of Twitter, discussions happening on the Twitter thread get back-fed as webmentions via a backfeed service, and so they show up below the respective post.

Once there’s a good Webmentions integration with Mastodon, I am switching to that as a backup interaction method for folks unfamiliar with sending Webmentions.