One of the admins at lemmy.blahaj.zone asked us to purge a community and all of its users because they thought it was full of child sexual abuse material, aka CSAM, fka kiddy porn. We assured them that we had checked this comm thoroughly and were satisfied that all of the models on it were of age. The admin then demanded we purge the comm because it looked like CSAM, and claimed that the entire point of the community was to make people think it was CSAM. We vehemently disagreed that that was in fact the point of the community, but they decided to defederate from us anyway. That is of course their choice, but we will not purge our communities or users because someone else makes a mistake of fact and then lays the responsibility for their mistake at our feet.

If someone made a community intended to fool people into thinking it was kiddy porn, that would be a real problem. If someone of age goes online and pretends – not roleplays, but pretends with intent to deceive – to be a child and makes porn, that is a real problem. Nobody here is doing that.

One of the reasons we run our instance the way that we do is that we want it to be inclusive. We don't body shame, and we believe that all adults have a right to sexual expression. That means no adult on our instance is too thin, fat, bald, masculine, old, young, cis, gay, etc., to be sexy, and that includes adults that look younger than some people think they should. Everyone has a right to lust and to be lusted after. There's no way to draw a line that says "you can't like adult people that look like X" without crossing a line that we will not cross.

EDIT: OK, closing this post to new comments. Everything that needs saying has been said. Link to my convo with the blahaj admin here: https://lemmynsfw.com/comment/683520
First of all, I want to make it clear that I don't agree with this defederation; if the models are verified adults, then there is no problem.
That said, as a Mastodon instance admin, I wanna explain something to y'all. CSAM is one of those things that you do not want to take your chances with as an admin. Beyond the obvious fact that it's vile, even having that shit cached on your server can potentially lead to very serious legal trouble. I can see how an admin might choose to defederate: even if all models are verified right now, what if something slips through the cracks (pun not intended, but I'll roll with it)?
My instance defederates a bunch of Japanese artist instances like pawoo because of this. All it takes is one user crossing the line, one AI-generated image that looks too real.
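(For the curious, the block itself is one admin API call. A rough sketch, assuming Mastodon 4.0+'s `POST /api/v1/admin/domain_blocks` endpoint; the instance URL, token, and domain below are placeholders, and the token would need the `admin:write:domain_blocks` scope.)

```python
# Hedged sketch: defederate a domain via Mastodon's admin API.
# INSTANCE, TOKEN, and the blocked domain are all placeholders.
import json
import urllib.parse
import urllib.request

INSTANCE = "https://your.instance"  # placeholder
TOKEN = "YOUR_ADMIN_TOKEN"          # placeholder

payload = urllib.parse.urlencode({
    "domain": "example.social",     # placeholder domain to block
    "severity": "suspend",          # full defederation
    "reject_media": "true",         # don't cache their media locally
    "private_comment": "risk of unverifiable material",
}).encode()

req = urllib.request.Request(
    f"{INSTANCE}/api/v1/admin/domain_blocks",
    data=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))
```

The reject_media flag is the part that matters for the caching problem I mentioned: the server stops storing local copies of that domain's attachments, and something like `tootctl media remove` can purge what was already cached.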
Aside from all that, there's also a lot of pressure on many instance admins to outright ban users and defederate instances that post or allow loli/shota artwork. You're quickly labeled a pedophile if you don't do it. A lot of people consider fake CSAM to be just as bad, so it's possible that the other admin felt that way.
I'm more lenient on loli/shota as long as it's not realistic, because I understand that it's a cultural difference and, generally speaking, Japanese people don't see it the way we do. I don't ban stuff just because I think it's gross; I just don't look at it.
Anyway, what I'm trying to say, I guess, is that being an admin is hard and there's a lot of stuff y'all don't know about. So disagree with that person if you want (I do too), but keep in mind that these decisions don't come easy, and nobody likes to defederate.
EDIT: here's a Mastodon thread about the CSAM problem in the fediverse, if you'd like to learn more.
If they had offered that as an explanation, there would have been no drama.
Well yeah I’m not like defending them or anything. I just kind of understand where they’re coming from too.
Yeah, but on the other hand, it is verifiably not CSAM.
The problem is that if it’s hard to tell at a glance, there’s no way to know if actual CSAM gets uploaded there in the future. So what it boils down to is, is it worth the risk? That admin says no, it isn’t, so they defederate.
My Mastodon instance defederates pretty much any instance that allows sexually explicit or suggestive artwork or photos of people who look underage. It’s just not worth it.
then why even federate at all? someone else could post CSAM at any time
I can’t tell if you’re trying to be funny or not but I’ll answer anyway.
There's a difference between federating with instances that disallow any pornography featuring models/characters/etc. who look underage, and federating with instances that allow that type of material. Actual CSAM will be immediately obvious and handled very quickly on the former, but not necessarily on the latter.
It's pretty much standard practice for Mastodon/*key/whatever admins to defederate instances that allow lolicon/shotacon and anything else like that. There are curated block lists out there and everything; we've been doing it for years while still federating, and we're doing just fine.
You don't host it like Mastodon would, AFAIK, nor do I agree with the so-called standard practice.
What do you mean I don't host it like Mastodon would? Are you talking about the "official" Mastodon instance, mastodon.social? Because they do exactly what I described above, and you'd have known if you'd just checked https://mastodon.social/about before posting. Just click "moderated servers". See all the "inappropriate content" that's listed there? Most of those allow lolicon/shotacon (as evidenced by their defederation of pawoo).

It is standard practice, and whether or not you agree is completely irrelevant. That's how most instances are run, period, except for the free speech absolutists, who are also defederated by everyone else for allowing bigotry/propaganda/lolicon and so on.
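(And if you'd rather script it than click around: Mastodon also publishes that list over the API on servers that opt in. A small sketch, assuming the public `GET /api/v1/instance/domain_blocks` endpoint that backs the "moderated servers" section; note some entries come back with the domain partially obfuscated.)

```python
# Hedged sketch: read mastodon.social's published block list via
# GET /api/v1/instance/domain_blocks (only available where the
# admin has chosen to make the list public).
import json
import urllib.request

url = "https://mastodon.social/api/v1/instance/domain_blocks"
with urllib.request.urlopen(url) as resp:
    blocks = json.load(resp)

# Each entry carries the blocked domain (sometimes obfuscated),
# a severity ("silence" or "suspend"), and the public comment.
for b in blocks:
    print(b["severity"], b["domain"], "-", b.get("comment"))
```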
I know that there’s a tradition of confidently posting on reddit about something you don’t know anything about and acting like your opinion on the subject you know nothing about is Very Important, but this ain’t reddit.