Hacker News

I don’t understand how people can be so confident that this will only lead to good things.

First, this seems like courts directly overruling the explicit wishes of Congress. As much as Congress critters complain about CDA Sec 230, they can’t agree on any improvements. Judges throwing a wrench into it won’t improve it; it will only cause more uncertainty.

Not liking what social media has done to people doesn’t seem like a good reason to potentially destroy the entire corpus of videos created on YouTube.



Congress did not anticipate the type of algorithmic curation that the modern internet is built on. At the time, if you were to hire someone to create a daily list of suggested reading, that list would not be subject to 230 protections. However, with the rise of algorithmic media, that is precisely what modern social media companies have been doing.


I can make a decent argument that even offering a basic “sort by” dropdown, where the platform sets a _default_ sort, qualifies as an “algorithm”.
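To make the point concrete, here is a hypothetical sketch (the field names and ranking weights are invented for illustration, not taken from any real platform): even a bare-bones "sort by" feature embeds platform-chosen logic, because the site, not the user, picks the default ordering and its weighting.

```python
# Hypothetical sketch: a minimal "sort by" dropdown. The default
# ("relevance") and its weighting are chosen by the platform, not the
# user -- which is the sense in which even this is an "algorithm".

def sort_items(items, order="relevance"):
    if order == "relevance":
        # Invented weighting: likes count double vs. views.
        return sorted(items, key=lambda i: i["likes"] * 2 + i["views"], reverse=True)
    if order == "newest":
        return sorted(items, key=lambda i: i["posted_at"], reverse=True)
    return items  # "unsorted": raw insertion order

items = [
    {"id": 1, "likes": 5, "views": 10, "posted_at": 3},
    {"id": 2, "likes": 50, "views": 100, "posted_at": 1},
]

# Calling with no explicit order applies the platform's default ranking.
print([i["id"] for i in sort_items(items)])           # platform default
print([i["id"] for i in sort_items(items, "newest")]) # user override
```

A user who never touches the dropdown still receives a platform-curated ordering, which is exactly the gray area the parent comments are arguing about.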

I’m arguing that judges shouldn’t be tearing down the existing legal regime without someone actively planning for a replacement that doesn’t have massive _unintended consequences_ (which all of the proposed Sec 230 reforms have).

And the fact that you only mentioned “modern social media companies” means that you are also underestimating which offerings qualify. Sec 230 protects all websites and apps that show _any_ user content to other users, not just the large social media companies. Think review sites, online classifieds, your group chat in a Messenger app, blog comments, social recipe sites, shared bookmarking sites, the “memo” field of transactions in every blockchain, etc.

And the obvious worry I have is that new jurisprudence starts pulling Jenga pieces away from AI chatbots before Congress even decides whether those qualify as a platform, a publisher, or something completely different.


Congress has had ample opportunity to pass changes since the rise of algorithmic feeds too though and they haven't done so.


No, 230 is not overturned.

The original video is still the original poster's comment, and thus still 230 protected. If the kid searched specifically for the video and found it, TikTok would have been safe.

However, TikTok's decision to show the video to the child is TikTok's speech, and TikTok is liable for that decision.

https://news.ycombinator.com/item?id=41392710


If the child hears the term "blackout" and searches for it on TikTok and reaches the same video, is that TikTok's speech - fault - as well? TikTok used an algorithm to sort search results, after all.


I think the third sentence of the comment you’re replying to answers that


So you believe that presenting the results (especially if you filter on something like 'relevance') of a search now makes the website liable?

That's going to be hell for Google. Well, maybe not; they have plenty of good lawyers.


I’m not sure you read the sentence in question correctly


> However, TikTok's decision to show the video to the child is TikTok's speech, and TikTok is liable for that decision.

How is my interpretation incorrect, please? TikTok (or any other website like Google) can show a video to a child in any number of ways - all of which could be considered to be their speech.


The third sentence is "If the kid searched specifically for the video and found it, TikTok would have been safe."


Aah, I counted paragraphs - repeatedly - for some reason. That's my bad.

That said, this is a statement completely unsubstantiated in the original post or in the post that it links to, or the decision in TFA. It's the poster's opinion stated as if it were a fact or a part of the Judge's ruling.


You're right, I did jump to that conclusion. It turns out it was the correct conclusion, although I definitely shouldn't have said it.

https://news.ycombinator.com/item?id=41394465


From page 11 of the decision:

"We reach this conclusion specifically because TikTok’s promotion of a Blackout Challenge video on Nylah’s FYP was not contingent upon any specific user input. Had Nylah viewed a Blackout Challenge video through TikTok’s search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content."


Well, if we consider the various social media sites:

Meta - Helped facilitate multiple ethnic cleansings.

Twitter - Now a site run by white supremacists for white supremacists.

Youtube - Provides platforms to Matt Walsh, Ben Shapiro and a whole constellation of conspiracy theorist nonsense.

Reddit - Initially grew its userbase through hosting of softcore CP, one of the biggest pro-ana sites on the web, and a myriad of smaller but no less vile subreddits. Even if they try to put on a respectable mask now, it's still a cesspit.

Linkedin - Somehow has the least well adjusted userbase of them all, its destruction would do its users a kindness.

My opinion of social media goes far beyond what anyone could consider "not liking".

In any case, it would mean that those videos would have to be self-hosted and published; we'd see an en masse return of websites like College Humor and Cracked and the like, albeit without the comments switched on.


Baby. Bath water.

You are arguing that mega social media companies should not be immune from liability. I couldn’t care less about those companies, but I do care about the unintended consequences of this ruling.

The mega social media companies weren’t immune from liability before the ruling. But they are mega corporations, with lots of attorneys on retainer who craft the ToS / EULA / other contracts to shed legal risk. Even this ruling isn’t likely to hurt them much in the long run.

The “baby” is every small blog that allows comments, every store / product review, every social bookmarking site, every game with multi-player chat, etc. These are examples of features available because of Sec230 protections. If some enterprising attorney can spin the _default sort order_ into a statistically significant harm to their plaintiff, every single website/app just became a far bigger target for litigation. And even if they can’t, now every mom and pop website will have to pay an attorney to find out if the plaintiff has a case according to this new vague standard.

And this happened because of a judge / jurisprudence, not because of a lawmaking body that solicited feedback from both companies and consumers. This ruling is likely to stand no matter the legal / social / economic fallout.

Craigslist already lost the Personals section to Sec230-modifying legislation. That was a drop in the bucket compared to what we could lose from this ruling.


> In any case, it would mean that those videos would have to be self-hosted and published; we'd see an en masse return of websites like College Humor and Cracked and the like, albeit without the comments switched on.

You seem to be making many assumptions here.

(1) I don’t think this cripples the mega social media companies. They already have thousands of attorneys — they will be busy for a year shedding liability risk by crafting more onerous ToS that we all end up agreeing to.

(2) Nobody “self hosts” from soup-to-nuts (except for the biggest companies in the world). Your ISP, your DNS provider, your cloud host, etc. all benefit from Sec 230 protections to some extent. We have to wait to see the fallout of those layers.

(3) Companies like College Humor and Cracked benefitted from viral marketing of social networks. If your implied expectation comes true and big social media companies are crippled by this ruling, there will be fewer upstart acts like College Humor and Cracked that grow to become something notable.

(4) Even small companies like College Humor and Cracked won’t be immune from this new redefinition of the line between platform/publisher and speech. My suspicion is this ruling pulled out a Jenga piece, but it will be a while before we see how the tower of internet economics falls.


> Companies like College Humor and Cracked benefitted from viral marketing of social networks.

Oh come off it, writers very explicitly blame the collapse of CH and others on Facebook lying about the viral impact of its social network.

https://twitter.com/adamconover/status/1183209875859333120


YouTube and Facebook were also the original hosts of the Blackout trend videos and pictures, as I recall.


The person you're responding to didn't say they were confident about anything, they said (cynically, it seems to me) that it could lead to the end of many social media sites, and that'd be a good thing in their opinion.

This is a pedantic thing to point out, but I do it because the comment has been downvoted, and the top response to it seems to misunderstand it, so it's possible others did too.


I’m pretty sure I didn’t misunderstand the parent comment.

I just didn’t choose to address only that comment in my reply — I spun my reply in the context of all of the previous discussions about Sec230 reform — because that’s the underlying worry that I have. I couldn’t care less if the biggest social media companies die off, but I _do_ worry about all of the other unintended consequences of this ruling (more accurately, of changing the definition of “speech” to include something so far removed from direct and intentional expression).



