
Over the years, I made a handful of maps of various things in Cambridge; I have collected some, but not all of them, on this page about housing things in Cambridge.

This includes things like maps of where you could legally build a fourplex (short answer: not many places!), the distribution of tax paid per parcel (Kendall Square pays a lot!), and more.

crschmidt.net/housing/cambridg


Fun fact: sharing this link on Mastodon caused my server to serve 112,772,802 bytes of data, in 430 requests, over the 60 seconds after I posted it (>7 r/s). Not because humans wanted them, but because of the LinkFetchWorker, which kicks off 1-60 seconds after Mastodon indexes a post (and possibly before it's ever seen by a human).

Every Mastodon instance fetches and stores its own local copy of my 750 KB preview image.
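For a sense of where that burst comes from, here's a minimal sketch of the jittered scheduling described above. The function name is hypothetical, not Mastodon's actual internal API; the point is that even with each instance picking a random delay in a 1–60 second window, hundreds of instances still average out to a steady multi-request-per-second load.

```python
import random

def schedule_preview_fetch(status_url: str) -> float:
    """Pick a delay, in seconds, before fetching the link preview.

    Mirrors the 1-60 second jitter described above. Hypothetical
    name; this is not Mastodon's real scheduling code.
    """
    return random.uniform(1.0, 60.0)

# With ~430 instances all jittering over the same 60-second window,
# the aggregate rate is still 430 / 60, i.e. about 7 requests/second.
delays = [schedule_preview_fetch("https://example.com/post") for _ in range(430)]
rate = len(delays) / 60
print(round(rate, 1))  # 7.2
```

The jitter spreads the load within the window, but it can't reduce the total number of fetches: one fetch per instance, no matter what.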

(I was inspired to look by @jwz's post: mastodon.social/@jwz/109411593.)

jwz (@jwz@mastodon.social): Mastodon stampede. "Federation" now apparently means "DDoS yourself." Every time I do a new blog post, within a second I have over a thousand simultaneous hits of that URL on my web server from unique IPs. Load goes over 100, and mariadb stops... https://jwz.org/b/yj6w

@crschmidt well, this sounds like a p0 bug. Mastodon is going into robots.txt on many servers once this gets noticed widely.

@cshabsin Don't worry! I just confirmed that Mastodon doesn't respect robots.txt for any of these fetches, so even if it's added to robots.txt, it will have no effect!
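For contrast, a fetcher that did honor robots.txt could check with nothing but the standard library before requesting anything. This sketch parses a hypothetical robots.txt inline (rather than fetching a real one) to show the check a polite crawler would make:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site admin might publish to block
# preview fetchers. Whether Mastodon would ever read it is the
# whole point of the complaint above: today it doesn't.
robots_txt = """\
User-agent: Mastodon
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Mastodon", "https://crschmidt.net/housing/"))     # False
print(parser.can_fetch("Mozilla/5.0", "https://crschmidt.net/housing/"))  # True
```

The check is two lines of code on the fetcher side, which is part of why the omission stings.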

@crschmidt @cshabsin They at least identify themselves at the User-Agent level, though, so you can filter on the server side.
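A rough sketch of what that server-side filter could look like. The exact substrings to match are an assumption here, though Mastodon's fetcher does identify itself with "Mastodon/<version>" in its User-Agent; a server could route matching requests to a 429 or a stripped-down page:

```python
def is_fediverse_fetcher(user_agent: str) -> bool:
    """Crude check for fediverse link-preview traffic by User-Agent.

    The substrings matched here are illustrative assumptions;
    real deployments should verify against their own logs.
    """
    ua = user_agent.lower()
    return "mastodon" in ua or "pleroma" in ua

print(is_fediverse_fetcher("http.rb/5.1.1 (Mastodon/4.0.2; +https://hachyderm.io/)"))  # True
print(is_fediverse_fetcher("Mozilla/5.0 (X11; Linux x86_64)"))                          # False
```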


@Snausages @crschmidt Well, that's going to make for a nice user experience when Mastodon servers can no longer fetch previews anywhere because everyone has figured out that Mastodon is a DoS engine...

@Snausages @cshabsin Bandwidth is not the only problem. Many web services are designed to serve their expected usage; even for popular-ish blogs, that is more like "one request per hour" than "50 requests per second". Having 973 requests come in simultaneously and then trickling out the response bytes is not solving the problem on the server side.

@Snausages @cshabsin But even so, the vector for abuse here is massive: every reply to George Takei is literally a distributed network of 1000+ nodes making requests to any URL you provide! Reply to him 10 times and you've generated 10,000+ requests to your target over 60 seconds. How many servers aren't designed for 166 qps? How about 333 qps?

Also, how long would it take whatever server he is hosted on to respond to an abuse claim, if one was reported?
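The back-of-envelope math behind those numbers. The node and reply counts are the thread's own estimates, not measurements:

```python
# Amplification estimate for the abuse scenario above.
nodes = 1000     # instances subscribed to a very large account
replies = 10     # replies containing the target URL
window = 60      # seconds over which the fetches are jittered

requests = nodes * replies
qps = requests / window
print(requests)         # 10000
print(round(qps, 1))    # 166.7
```

Doubling the reply count doubles the rate, so 20 replies pushes the target past 333 qps.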

@Snausages And how many admins are going to take this into account when they shut Mastodon out? How many are going to be paying enough attention to remove the block once Mastodon announces they've fixed this?