
If I as a forum administrator delete posts by obvious spambots, am I making an editorial judgment that makes me legally liable for every single post I don’t delete?

If my forum has a narrow scope (say, 4×4 offroading), and I delete a post that’s obviously by a human but is seriously off‐topic (say, U.S. politics), does that make me legally liable for every single post I don’t delete?

What are the limits here, for those of us who, unlike Silicon Valley corporations, don't have massive legal teams?



> If my forum has a narrow scope (say, 4×4 offroading), and I delete a post that’s obviously by a human but is seriously off‐topic (say, U.S. politics), does that make me legally liable for every single post I don’t delete?

No.

From the court of appeals [1], "We reach this conclusion specifically because TikTok’s promotion of a Blackout Challenge video on Nylah’s FYP was not contingent upon any specific user input. Had Nylah viewed a Blackout Challenge video through TikTok’s search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content."

So, assuming that users on your forum choose some kind of "4x4 Topic", they're intending to navigate a repository of third-party content. If you curate that repository, it's still a collection of third-party content and not your own speech.

Now, if you were to have a landing page that showed "featured content", then it seems you could get into trouble. Although one wonders what the difference is between navigating to a "4x4 Topic" and "Featured Content", since both are user actions.

[1]: https://fingfx.thomsonreuters.com/gfx/legaldocs/mopaqabzypa/...


> then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content."

"may"

Basically until the next court case when someone learns that search is an algorithm too, and asks why the first result wasn't a warning.

The real truth is, if this is allowed to stand, it will be selectively enforced at best. If it's low enough volume it'll just become a price of doing business: sometimes a judge has it out for you and you have to pay a fine, and you work it into the budget. Fine for big companies, a game-ender for small ones.


> Now, if you were to have a landing page that showed "featured content" then that seems like you could get into trouble. Although one wonders what the difference is between navigating to a "4x4 Topic" or "Featured Content" since it's both a user-action.

Consider HackerNews's functionality of flamewar suppression. https://news.ycombinator.com/item?id=39231821

And this is the difference between https://news.ycombinator.com/news and https://news.ycombinator.com/newest (with showdead enabled).


> If my forum has a narrow scope (say, 4×4 offroading), and I delete a post that’s obviously by a human but is seriously off‐topic (say, U.S. politics), does that make me legally liable for every single post I don’t delete?

According to the article, probably not:

> A platform is not liable for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

"Otherwise objectionable" looks like a catch-all phrase to allow content moderation generally, but I could be misreading it here.


I'm guessing you're not a lawyer, and I'm not either, so there might be some non-obvious details, but the statute draws the line at allowing you to take[1]:

> any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

I think that allows your use case without liability.

[1] https://www.law.cornell.edu/uscode/text/47/230


That subsection of 230 is about protecting you from being sued for moderating, like being sued by the people who posted the content you took down.

The "my moderation makes me liable for everything I don't moderate" problem is addressed by the preceding subsection, the core of the law and the part that's most often at issue, which says that you can't be treated as the publisher or speaker of anyone else's content.


Wow, "or otherwise objectionable" would seemingly give providers a loophole wide enough to drive a truck through.


It's not a loophole. That's the intended meaning, otherwise it would be a violation of freedom of association.

That doesn't mean anyone is free to promote content without liability, just that moderating by deleting content doesn't make it an "expressive product."


Both are protected, because both are 1A activity.


What the other replies are not quite getting is that there can be other kinds of moderator actions: acting on posts that aren't off-topic or offensive, but that simply don't meet the bar for the forum in question. Are those considered out of scope by this ruling?

As an example, suppose on a HN thread about the Coq theorem prover, someone starts a discussion about the name, and it's highly upvoted, but the moderators downrank that post manually to stimulate more productive discussion. Is this considered curation, and can it no longer be done given this ruling?

It seems to me that this is indeed the case, but in case I'm mistaken I'd love to know.


Wouldn't it be more that you are responsible for pinned posts at the top of thread lists? If you pin a thread promoting an unsafe on-road product, say telling people they should be replacing their steering with heim joints that aren't street legal, you could be liable. Whereas if you just left the thread among all the others, you wouldn't be. (Especially if the heim joints are sold by a forum sponsor or the forum has a special 'discount' code for the vendor.)


Let me ask you a question in return.

If you discovered a thread on the forum where a bunch of users were excitedly talking about doing something incredibly dangerous in their 4x4s, like getting high and trying some dangerous maneuver, would you let it sit on your forum?

How would you feel if somebody read about it on your forum and died trying to do it?

Update: The point I'm trying to make is that _I_ wouldn't let this sit on my forum, so I don't think it's unethical to ask others to remove it from their forums as well.


Not the OP, but if I thought we were all joking around, and it was the type of forum that allowed people to be a bit silly, I would let it stand. Or if I thought people on the forum would point out the danger and hopefully dissuade the poster and/or others from engaging in that behavior, I would let it stand.

However, if my hypothetical forum received a persistent flood of posts designed to soften people up to dangerous behaviors, I'd be pretty liberal removing posts that smelled funny until the responsible clique moved elsewhere.


I think you're looking for the kind of precision that just doesn't exist in the legal system. It will almost certainly hinge on intent and the extent to which your actions actually stifle legitimate speech.

I imagine that getting rid of spam wouldn't meet the bar, and neither would enforcing that conversations are on-topic. But if you're removing and demoting posts because they express views you disagree with, you're implicitly endorsing the opinions expressed in the posts you allow to stay up, and therefore are exercising editorial control.

I think the lesson here is: either keep your communities small so that you can comfortably reason about the content that's up there, or don't play the thought police. The only weird aspect of this is that you have courts saying one thing, but then the government breathing down your neck and demanding that you go after misinformation.


A lot of people seem to be missing the part where, if it ends up in court, you have to argue that what you removed was objectionable on the same level as the other named types of content, and there will be a judge you need to convince that you didn't reinterpret the law to your benefit. This isn't like arguing on HN or social media; being "clever" doesn't necessarily protect you from liability or consequences.


Even if you are simply not shielded from liability, I cannot imagine a scenario in which this moderation policy would result in significant liability. I'm sure somebody would be willing to sell you insurance to that effect. I certainly would.



