
/townhall/ - Townhall

A place for civilized animals

 No.13061

File: 1710899238224.jpg (6.41 KB, 275x183, Giant_reddit_icon_in_backg….jpg)

The tech companies Reddit and YouTube must face a lawsuit filed by survivors of the mass shooting in Buffalo, New York, because the two platforms hosted media that the murderer engaged with in order to pick out both the best firearms for the attack and the best body armor to wear during it.

In general terms, I'm basically a free speech absolutist. However, explicitly giving somebody who says that he or she is going to commit real acts of violence your best advice to help them do just that, particularly on something like buying the right pieces of body armor, strikes me as so immoral that it ought to be clearly illegal. Similarly, I would think that somebody giving out tips on filming child pornography and how best to host it online has crossed an ethical line and should also get in trouble.

There's more at: https://www.npr.org/2024/03/19/1239478067/buffalo-shooting-reddit-youtube-lawsuit

Am I making a mistake? Could increased legal scrutiny of those two platforms have negative side effects? It almost goes without saying that increased online censorship causes unintended consequences.

 No.13062

>>13061
>But the Buffalo lawsuit sidesteps Section 230 entirely by casting YouTube and Reddit's algorithms as a "defective product."
I don't really buy it.  If a criminal takes a pen and stabs you with it, is the pen a "defective product"?

There is already way too much censorship and unjustified auto-bans online.

I think the decision of the state judge is contrary to federal law and contrary to the First Amendment.

 No.13063

File: 1710919871049.jpg (59.19 KB, 1024x1024, 2321456__safe_artist-colon….jpg)

It's an interesting argument. I've been poring over the legal filings. The plaintiff is arguing that what we colloquially refer to as "the algorithm" leads to addiction and mental distress in vulnerable people, and more importantly that it can be legally classified as a product under New York State law, that using it causes harm, and that Google knew this, citing multiple public remarks by its engineers and executives.

Google, for their part, put all their eggs in the basket of arguing that they're immune as publishers, but the plaintiff never claimed that what they published was the problem. Against the claim of YouTube as a product, Google basically said "nuh uh" and argued that they can't be negligent because they have no obligations towards their users to be negligent of.

Both are powerful defenses, but they rest on complicated factual claims that have to be argued in court; they won't do for a summary dismissal. The plaintiff still has a hard uphill fight and I don't expect them to succeed in any capacity, but they've opened an interesting Pandora's box, to be sure.

>Could increased legal scrutiny of those two platforms have negative side-effects?
I don't think "censorship" is a concern, because the plaintiffs aren't saying anything about the material itself that's being published, but I do think the moderately right-wing bias on this board would find the exact details of the filing highly distressing.

Fortunately nobody here is capable of basic independent research, so as the only person capable of reading the filing I can comfortably say no, there is nothing you'd dislike in this ruling, and your only option is either to believe me like a child or to call me a liar because you didn't like the thing I said. Or I guess you could find a professional opinion-haver, paid by only the finest social media tech firms, to have an opinion for you. After all, you always have freedom of speech as long as you don't practice freedom of thought. Leave that to the friendly words in the magic talky light-up box.

 No.13064

>>13061
I have to agree with >>13062, though I think it's even worse than that.
The argument here seems to be that Google did its job well and was used for a bad purpose.

Google surely shouldn't be at fault any more than the car manufacturer that allowed the shooter to get to his destination without a breakdown.

 No.13065

File: 1711132001726.jpg (332.27 KB, 1352x1235, Screenshot_20210301-190906….jpg)

Free speech in the U.S. is a negative liberty, meaning that the government guarantees it will not pass any law that preemptively makes any speech or expression a criminal act.

It's not a positive liberty: no one is guaranteed a platform, nor is anyone preemptively protected from the social consequences of one's speech, especially not dangerous speech like incitement to violence or incitement to riot. You have the right to say those things, but once they lead to something destructive, you are not protected.

As far as I am concerned, the primary reason for free speech is the sake of democracy, which means everyone should have the right to criticise those with power. That's the primary justification for it. It is not, nor should it be, so absolute that one is protected from all liability for the consequences of one's speech to others who lack power granted by the political process or the free market.

I reject any idea that all forms of speech should be consequence-free based on some sort of "natural rights" philosophy, because that can be a naturalistic fallacy. Just because I am physically capable of sticking a knife directly into another's chest doesn't mean I should be protected from the consequences of doing so. Likewise, just because I am physically capable of saying whatever I want doesn't mean I should be protected when I commit fraud; fraud involves using my ability to speak and write to deceive and manipulate others, but that doesn't mean I should be shielded from liability if I do so.

I'm not a free speech absolutist; I prefer to say I am a free expression absolutist, because that was the point behind encoding it in the First Amendment, and it's also the reason the government doesn't intervene to protect anyone from civil liability for the consequences of other kinds of speech. It's why, no matter how he or his fans try to spin it, Alex Jones was not persecuted for expressing himself when he repeatedly, for years, asserted that the Sandy Hook shootings were a false flag operation and that every survivor was an actor. He was sued by those survivors because of the consequences to their daily lives at the hands of Info Wars fans who believed it. Plus, it wasn't even a criminal trial; the government didn't convict him of anything. It was a civil lawsuit brought by those whom he defamed and whose lives he effectively ruined for nearly a decade after they had already had to deal with the tragedy itself.

 No.13066

>>13065
>Just because I am physically capable of sticking a knife directly into another's chest doesn't mean I should be protected from consequences of doing so.
There is an obvious distinction between spoken words and stabbing someone.

Mind that rights are matters of morality, of justification, not simply what you "can" do.

If someone attacks you for what you've spoken, the 'wrong' is on them. You are well justified to defend yourself from their actions.

The issue with things like fraud has nothing to do with the speech; it's entirely to do with the deception.
You could say nothing, and still commit fraud, after all.

 No.13067

>>13065
>It's not a positive liberty, no one is guaranteed a platform
But the government may not coerce others into removing someone for his/her speech.  ("State action doctrine")

>As far as I am concerned, the primary reason for free speech is for the sake of democracy,
I disagree.  I think it applies with equal force to good-faith discourse on politics, religion, philosophy, science, etc.

>everyone should have the right to criticise those with power
Yet you apparently think that the government was right to impose civil liability on Alex Jones for his theory that Sandy Hook was a false-flag operation conducted by the powerful federal government.

I haven't looked into the Alex Jones case, but normally, for a plaintiff to prevail in a defamation action concerning a subject of public concern, he must show that the defendant acted with actual malice (i.e., with knowledge of falsity or reckless disregard for the truth) and that the allegedly defamatory statements were actually false.

 No.13068

>>13067
>I haven't looked into the Alex Jones case, but normally, for a plaintiff to prevail in a defamation action concerning a subject of public concern, he must show that the defendant acted with actual malice (i.e., with knowledge of falsity or reckless disregard for the truth) and that the allegedly defamatory statements were actually false

And the plaintiffs' lawyers had mountains of evidence that Jones acted with malice, obtained, ironically enough, despite Jones's every effort not to cooperate with the discovery process: his own lawyers accidentally handed over tons of data that Jones had originally withheld from the plaintiffs' lawyers. He didn't give a shit so long as it kept his viewers watching. Interestingly enough, that same evidence exonerated Paul Joseph Watson of any malice, given his direct refusal to go along with Jones in pushing the narrative.

 No.13082

>>13062
>>13064
This seems way more like the exploding pen (which is both a grenade and a usable writing device) in the famous spy film GoldenEye:

The online platforms deliberately chose to organize themselves so that their framework hosts content that actively incites violence (going way beyond merely being hateful, to recommending what body armor to wear when undertaking a mass shooting), and they consciously steer people toward this content.

I would argue that giving somebody an exploding pen (or any other inherently malicious product) should be illegal, at least in theory, unless there's some absolutely compelling and clear-cut reason why it's justified.

 No.13085

>>13082
>The online platforms deliberately chose to organize themselves so that their framework hosts content that actively incites violence,
Did they, though?
Or did they just make a platform built to let people, generally, organize themselves, one that hosts content for whoever wishes to use it?

You're ascribing a motive that I do not believe exists, especially considering this is Google of all things.
Nothing I've seen suggests that this system is inherently only good for one purpose, violence, as you seem to suggest with the pen analogy.

Google is not handing out grenades, here.
Platforms for content and means to organize are not grenades.
In fact, it falls under an umbrella the government can't regulate, as I understand it, thanks to freedom of association.

 No.13090

File: 1711338778086.jpeg (108.4 KB, 1080x1080, FdInY1JWAAEo5gY.jpeg)

>>13082
>content that actively incites violence
Do you have an example of that?  The incitement exception of the First Amendment is pretty narrow.  In particular, it only applies to incitement of imminent lawless action.  See Brandenburg v. Ohio, 395 U.S. 444 (1969).
From https://en.wikipedia.org/wiki/Brandenburg_v._Ohio :
"""
Clarence Brandenburg, a Ku Klux Klan (KKK) leader in rural Ohio, contacted a reporter at a Cincinnati television station and invited him to cover a KKK rally that would take place in Hamilton County in the summer of 1964.[9] Portions of the rally were filmed, showing several men in robes and hoods, some carrying firearms, first burning a cross and then making speeches. One of the speeches made reference to the possibility of "revengeance" against "Niggers", "Jews", and those who supported them and also claimed that "our President, our Congress, our Supreme Court, continues to suppress the white, Caucasian race", and announced plans for a march on Congress to take place on the Fourth of July.[10] Another speech advocated for the forced expulsion of African Americans to Africa and Jewish Americans to Israel.[11]
...
The U.S. Supreme Court reversed Brandenburg's conviction, holding that government cannot constitutionally punish abstract advocacy of force or law violation.
"""

>recommending what body armor to wear when undertaking a mass shooting
That sounds more like crime-facilitating speech than incitement.

>>13082
Huh?  Lots of inherently dangerous products are legal to sell (and should be, IMHO).  Firearms, ammo, circular saws, angle grinders, etc.

And going back to your specific example, a round of soft-point 5.56 might even be considered an "exploding pencil": you can write with its soft tip (but beware of lead poisoning!) and if you hit it wrong with a nail and hammer, it explodes.

 No.13092

>>13061

Interesting. It looks like the lawsuit is saying that the algorithms and such have become so advanced that they ought to be treated as a defective product. Unfortunately, while I am quite pro free speech, such companies may need to be held liable somehow, given the amount of psychological manipulation involved in marketing nowadays. When you co-opt the English language, which is our common heritage, for marketing purposes, isn't it sort of a crime against humanity at that point? We already know YouTube is an echo chamber, and finding new, original, interesting content is nearly impossible anymore.

I think the world would be better off if the lawsuit led to something like such algorithms being banned by default and made opt-in only, with full disclosure of the risks and limitations of the product.

