The social platform loses the § 230 CDA safe harbor for negligent design (defective product) if it allows anonymous use

The District of Oregon, Portland Division, in its decision of July 13, 2022, Case 3:21-cv-01674-MO, A.M. v. Omegle.com LLC, offers an interesting lesson.

The social platform loses the safe harbor when the harm suffered by a user is caused not only by the conduct of another user, but also by the platform's own omission (indeed, arguably an affirmative act), consisting in the defective design of its software architecture. The defect lies, for example, in allowing anonymity and in failing to require users to declare, or to verify, their age (in this case, the platform had randomly paired an adult with a minor, who was then groomed by the former).

Shrewdly (or astutely), in order to bypass the § 230 CDA barrier, the plaintiff's attorney framed the action as one of manufacturer liability against the platform for a defective product (negligent design of the platform).

It therefore cannot be said that the claim asserted liability solely for the act of the third-party user.

<< Here, Plaintiff's complaint adequately pleads a product liability lawsuit as to claims one through four. Omegle could have satisfied its alleged obligation to Plaintiff by designing its product differently—for example, by designing a product so that it did not match minors and adults. Plaintiff is not claiming that Omegle needed to review, edit, or withdraw any third-party content to meet this obligation. As I will discuss in more detail below, the content sent between Plaintiff and Fordyce does not negate this finding or require that I find Omegle act as a publisher. The Ninth Circuit held in Lemmon that a defendant "allow[ing] its users to transmit user-generated content to one another does not detract from the fact that [a plaintiff] seek[s] to hold [the defendant] liable for its role in violating its distinct duty to design a reasonably safe product." 995 F.3d at 1092. "The duty to design a reasonably safe product is fully independent of [a defendant's] role in monitoring or publishing third party content." Id. In Lemmon it was immaterial that one of the decedents had sent a SnapChat with the speed filter on it. Instead, what mattered is that the claim treated defendant as a product manufacturer by accusing it of negligently designing a product (SnapChat) with a defect (the interplay between the speed filter and the reward system).

In this case, it similarly does not matter that there were ultimately chats, videos, or pictures sent from A.M. to Fordyce. As I stated at oral argument, it is clear that content was created; however, claims one through four do not implicate the publication of content. Tr. [ECF 32] at 10:6–11:8. What matters for purposes of those claims is that the warnings or design of the product at issue led to the interaction between an eleven-year-old girl and a sexual predator in his late thirties. >>