Special EU rules on the way for providers “to prevent and combat child sexual abuse”

On 11 May 2022 the EU Commission proposed a regulation laying down <<rules to prevent and combat child sexual abuse>>, COM(2022) 209 final – 2022/0155 (COD): the page is here and the direct link to the text is here.

I will only flag the part on providers’ obligations (Chapter II, “OBLIGATIONS OF PROVIDERS OF RELEVANT INFORMATION SOCIETY SERVICES TO PREVENT AND COMBAT ONLINE CHILD SEXUAL ABUSE”); the identification of the providers concerned (the personal scope of application) largely refers back to the Digital Services Act (see the draft).

  • providers will have to (i) carry out a risk assessment and (ii) adopt risk mitigation measures; the latter must meet the (rather vague) requirements of Article 4(2)
  • investigation/inspection obligations, Article 10 (a minimal sketch of the kind of indicator matching this presupposes follows the list below): <<Providers of hosting services and providers of interpersonal communication services that have received a detection order shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46. 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.>>
  • the ensuing duties, spelled out in Article 10(4), are burdensome: <<The provider shall: (a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to execute the detection orders addressed to them; (b) establish effective internal procedures to prevent and, where necessary, detect and remedy any misuse of the technologies, indicators and personal data and other data referred to in point (a), including unauthorized access to, and unauthorised transfers of, such personal data and other data; (c) ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention; (d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner; (e) inform the Coordinating Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3); (f) regularly review the functioning of the measures referred to in points (a), (b), (c) and (d) of this paragraph and adjust them where necessary to ensure that the requirements set out therein are met, as well as document the review process and the outcomes thereof and include that information in the report referred to in Article 9(3)>>
  • reporting: the details of this duty are set out in Article 13
  • removal duties, to be fulfilled within 24 hours, Article 14: <<remove or disable access in all Member States of one or more specific items of material>> (note the object: specific items)
  • blocking duties, Article 16 (an important provision, understandably, probably the most important together with the investigation and removal duties; see the second sketch further below): <<take reasonable measures to prevent users from accessing known child sexual abuse material indicated by all uniform resource locators on the list of uniform resource locators included in the database of indicators [under Article 44]>>, paragraph 1 (the long-standing question of how precisely the sites must be identified).
  • liability, Article 19: <<Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.>>
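
Purely by way of illustration of what a detection order under Article 10 presupposes on the technical side, here is a minimal sketch of matching an uploaded file against a set of indicators of known material. Everything in it is an assumption of mine: the indicator feed is reduced to a plain text file of hex-encoded SHA-256 digests and matching is exact, whereas real deployments would rely on the EU Centre’s feeds and, typically, on perceptual rather than cryptographic hashing, and detecting new material or solicitation would require classifiers that go well beyond a lookup.

```python
# Hypothetical sketch only: match uploads against "indicators" of known
# material, here simplified to exact SHA-256 digests read from a text file.
import hashlib
from pathlib import Path


def load_indicators(path: str) -> set[str]:
    """Load one hex-encoded digest per line into a lookup set."""
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}


def sha256_of(file_path: str) -> str:
    """Hash the file in chunks so large uploads need not fit in memory."""
    digest = hashlib.sha256()
    with open(file_path, "rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_material(file_path: str, indicators: set[str]) -> bool:
    """True if the upload matches an indicator; a hit would go to human review."""
    return sha256_of(file_path) in indicators


if __name__ == "__main__":
    indicators = load_indicators("indicators.txt")   # hypothetical indicator feed
    print(is_known_material("upload.bin", indicators))
```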

The Article 19 clarification is practically useful, since some had suggested that trying to prevent abuse would rule out the possibility of claiming <<I did not know>>. In theory, however, it is unnecessary, both because this is the fulfilment of a legal duty and because there is no negligent participation in the wrong (hosting of prohibited material) where the provider adopts technical countermeasures, even if these may take some time to implement and refine.
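
As for the blocking duty under Article 16, a similarly rough sketch of URL-list blocking: a requested address is normalised and looked up in the set of listed uniform resource locators. The normalisation rules, the sample entries and the UrlBlocker class are my own assumptions; the example also makes the precision problem concrete, since exact URL matching misses trivial variants of an address, while blocking at host level would overblock.

```python
# Hypothetical sketch of blocking by URL list (Article 16): a request is
# refused when its normalised URL appears in the set of listed URLs.
from urllib.parse import urlsplit, urlunsplit


def normalise(url: str) -> str:
    """Lower-case scheme and host, drop the fragment, default the path to '/'."""
    parts = urlsplit(url.strip())
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path or "/", parts.query, ""))


class UrlBlocker:
    def __init__(self, listed_urls: list[str]) -> None:
        self._blocked = {normalise(u) for u in listed_urls}

    def is_blocked(self, requested_url: str) -> bool:
        return normalise(requested_url) in self._blocked


blocker = UrlBlocker(["https://EXAMPLE.org/abuse/item1"])        # illustrative entry
print(blocker.is_blocked("https://example.org/abuse/item1#x"))   # True: same item
print(blocker.is_blocked("https://example.org/abuse/item1?a=1")) # False: query differs
print(blocker.is_blocked("https://example.org/other"))           # False
```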

  • a major problem will be the implementation costs for smaller providers.