Webmention, like many federated protocols, is indeed vulnerable to some forms of DDoS attack, although no practical attack has been spotted so far. Only annoying but harmless spam has been spread throughout the indieweb.
I would say the risk is more for the webmention endpoint than for your static website. The endpoints have to parse remote web pages to understand the semantics behind them. In this regard, ActivityPub is more appropriate because parsing JSON is waaaaay faster and less error-prone than parsing raw HTML. ⁽¹⁾
But in any case, nothing is wrong with the protocol in itself; we just need crypto auth (and encryption) plus moderation built on top. Some people within the indieweb movement are already working on this. We just need more pioneers to get on board and challenge the status quo.
Well, you could write your own ActivityPub endpoint (it’s really not hard), but if you need a prepackaged solution then there’s fed.brid.gy, a Bridgy ActivityPub endpoint that will turn ActivityPub + ActivityStreams 2.0 → Webmention + Microformats 2.
Wow, that sounds nice. How does that work? How can you guess the Mastodon API address (i.e. the instance) when you generate your site, if it changes with every user? Do you have to use some JavaScript voodoo to let users enter their instance address, and from there update the form action?
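For what it’s worth, the usual trick I’ve seen is exactly that kind of small JavaScript: ask the reader for their instance, then build the share URL client-side. A minimal sketch, assuming the instance serves Mastodon’s web `/share` page (which pre-fills the compose box from the `text` query parameter); the function name and element ids are mine, not from any library:

```javascript
// Build a Mastodon share URL from a user-entered instance domain.
// Assumption: the instance runs Mastodon and exposes the web "/share" page.
function shareUrl(instance, postUrl) {
  return `https://${instance}/share?text=${encodeURIComponent(postUrl)}`;
}

// In the page, wire it to a tiny form (hypothetical element ids):
// document.querySelector("#share-form").addEventListener("submit", (e) => {
//   e.preventDefault();
//   const instance = document.querySelector("#instance").value.trim();
//   window.location.href = shareUrl(instance, document.URL);
// });
```

So the static site never needs to know the instance at build time; the reader supplies it at click time.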
On a slightly different topic, I had started writing a content plugin system for my build script. I’m currently thinking about refining it and working on proper webmention.io integration. Are other people interested in such pre-packaged solutions? Is it worth spending my time on?
⁽¹⁾ How a webmention endpoint works: it receives a request saying “page A linked to page B”. From there, it checks that it is actually responsible for page B’s webmentions (for a shared endpoint, that means fetching page B and confirming it declares this endpoint via a rel="webmention" link), then fetches page A and verifies that it really does link to page B. Then, if those checks pass (i.e. we’re not receiving webmentions for a website we don’t manage, or from a fishy website that never linked to us), we can parse page A’s body to interpret the interaction that took place on that page. So that’s a lot of fetching, parsing and guessing for every interaction, which makes it more vulnerable to DoS/DDoS.
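The steps above can be sketched roughly like this (a toy sketch, not a real endpoint: `MY_DOMAIN` is a placeholder, the substring check stands in for proper HTML parsing, and a real receiver would also queue the work rather than fetch inline):

```javascript
const MY_DOMAIN = "example.com"; // the site this endpoint is responsible for (placeholder)

// Step 1: are we responsible for the target (page B)?
function isMyTarget(target) {
  return new URL(target).hostname === MY_DOMAIN;
}

// Step 2: does the source page (page A) really link to the target?
// A real endpoint would use an HTML parser here, not a substring check.
function sourceLinksToTarget(sourceHtml, target) {
  return sourceHtml.includes(`href="${target}"`);
}

// Putting it together: one fetch + parse per incoming mention,
// which is exactly the expensive part discussed above. (Node 18+ for fetch.)
async function verifyMention(source, target) {
  if (!isMyTarget(target)) return false;   // mention for a site we don't manage
  const res = await fetch(source);         // load page A
  if (!res.ok) return false;
  const html = await res.text();
  if (!sourceLinksToTarget(html, target)) return false; // fishy sender
  // ...then parse html's microformats (h-entry, in-reply-to, like-of, …)
  // to classify the interaction. That parsing is the costly, error-prone step.
  return true;
}
```

Every unverified request forces the endpoint to do outbound fetches and parsing, which is why flooding it is cheap for an attacker and expensive for the receiver.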