No, naturally (under US law), given the safe harbour of § 230 CDA (the complaint also challenges TikTok's removal of the account).
Dolan's "annoying" posts on Winter's account were not adequately filtered by Facebook, which Winter therefore sued.
But the court, as usual, dismisses the claim under the aforementioned § 230.
Of the three requirements (that the defendant be a provider; that the information come from third parties; that the plaintiff treat the defendant as a publisher or speaker), the third is always the most interesting, since it depends on how the plaintiff frames the claim. That framing sometimes raises doubts, but not here. The court correctly finds the third requirement satisfied as well:
<<Lastly, Plaintiffs are clearly seeking to treat Facebook and TikTok as publishers. (See Doc. No. 16 at 5-6; Doc. No. 24 at 5). Under § 230, a cause of action “treat[s]” an entity as a “publisher” if it would hold an entity responsible for “deciding whether to publish, withdraw, postpone or alter content.” Johnson, 614 F.3d at 792. Here, Plaintiffs seek to hold Facebook and TikTok liable for not removing the content/accounts posted/held by Dolan and her group, which they find objectionable.
The decision whether to publish, withdraw, postpone, or alter content are traditional editorial functions of a publisher. Zeran, 129 F.3d at 330; Klayman, 753 F.3d at 1359 (“Indeed, the very essence of publishing is making the decision whether to print or retract a given piece of content – the very actions for which Klayman seeks to hold Facebook liable.”).
Plaintiffs also seek to hold TikTok liable for removing their social media accounts from its platform, yet courts have found that § 230(c)(1) immunity also applies to the situation where a plaintiff objects to the removal of his or her own content. See Wilson, 2020 WL 3410349, at *12 (citing cases).
In opposition to Facebook and TikTok’s motions, Plaintiffs argue that by failing to follow their own Community Standards and taking down content that violates those standards, both Facebook and TikTok have acted in “bad faith” and thus cannot be protected under § 230(c)(2), the “Good Samaritan” provision of the CDA. (Doc. No. 16 at 6-7; Doc. No. 24 at 6-7). Plaintiffs’ argument is unavailing in that it ignores the fact that Facebook and TikTok are claiming immunity under § 230(c)(1), which contains no “good faith” requirement. E. Coast Test Prep LLC v. Allnurses.com, Inc., 307 F. Supp. 3d 952, 964 (D. Minn. 2018), aff’d, 971 F.3d 747 (8th Cir. 2020).
Moreover, Plaintiffs’ argument rests on a misconception about the purpose of the Good Samaritan provision. It was inserted not to diminish the broad general immunity provided by § 230(c)(1), but to assure it is not diminished by the exercise of traditional publisher functions. See Zeran, 129 F.3d at 331 (Section 230 was designed “to encourage service providers to self-regulate the dissemination of offensive material over their services” without fear they would incur liability as a result.). In this respect, the CDA was a response to cases such as Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 WL 323710, at *2 (N.Y. Sup. Ct. May 24, 1995), in which an ISP was found liable for defamatory statements posted by third parties because it had voluntarily screened and edited some offensive content, and so was considered a “publisher.” Id. at *4. “Section 230(c)(1) was meant to undo the perverse incentives created by this reasoning, which effectively penalized providers for monitoring content.” Zeran, 129 F.3d at 331. As TikTok aptly notes, Plaintiffs’ complaint about selective enforcement of content standards is precisely the type of claim Congress intended to preempt in § 230(c)(1)>>.
The negligence claim is dismissed as well: Facebook owed the plaintiff no specific duty of care. Its Community Standards created no such duty; otherwise the safe harbour would be rendered meaningless: <<To the extent Plaintiffs contend that the promulgation of Community Standards gives rise to a state-law duty of care in publishing, that argument has been rejected. “[S]tate law cannot predicate liability for publishing decisions on the mere existence of the very relationship that Congress immunized from suit.” Klayman, 753 F.3d at 1359-60. The CDA “allows [computer service providers] to establish standards of decency without risking liability for doing so.” Bennett v. Google, LLC, 882 F.3d 1163, 1168 (D.C. Cir. 2018) (quoting Green, 318 F.3d at 723).
Again, “[n]one of this means … that the original culpable party who posts defamatory messages [will] escape accountability.” Zeran, 129 F.3d at 330. It means only that, if Plaintiffs take issue with the posts and accounts of Dolan and her group, their legal remedy is against Dolan as the content provider, and not against Facebook and TikTok as the publisher>>.
(news of, and link to, the decision via Eric Goldman's blog)