An interesting European Parliament study on personalised pricing

Microtargeting, enabled by the software driving the large digital platforms, makes it possible to tailor offers of goods or services to the individual customer, though still in some measure by category (albeit with ever narrower categories, to the point that it may become hard to identify one at all, the offer resting entirely on the idiosyncrasies of a single person).

The EU Parliament has published the study: European Parliament, Directorate-General for Internal Policies of the Union, Rott, P., Strycharz, J., Alleweldt, F., Personalised pricing, Publications Office of the European Union, 2022 (direct link here).

It is certainly of interest, since the topic is quite new and could shake the economic and legal foundations of private exchange as we have known it.

Three degrees of personalisation are identified: < Price personalisation can take different forms, namely first-degree personalisation (based on personal characteristics of individual consumers), second-degree price personalisation (based on the quantity of products, e.g. when several bottles are sold in one package) and third-degree personalisation (based on membership in a market segment or consumer group, e.g. student rebate), and can be presented as a different price or a personalised discount. First-degree price personalisation is the most problematic of the three forms. It bases on the consumers’ willingness to pay, that can be inferred from different types of personal data processed on individual or aggregated level. Subsequently, a price matched to the willingness to pay is offered either automatically through algorithmic processing or non-automatically through human intervention >

Needless to say, it is the first degree that raises problems.
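The first-degree mechanism described in the quoted passage can be sketched as a toy model. This is purely illustrative: the signals, weights and floor price below are hypothetical, not taken from the study or from any real platform.

```python
# Toy sketch of first-degree price personalisation (hypothetical model):
# a price is matched to an inferred willingness to pay (WTP), estimated
# from personal data signals, and never offered below the seller's floor.

def infer_wtp(profile: dict) -> float:
    """Hypothetical WTP estimate from personal data signals."""
    wtp = 100.0                            # hypothetical baseline price
    if profile.get("premium_device"):
        wtp *= 1.2                         # e.g. browsing from a high-end device
    if profile.get("repeat_visits", 0) > 3:
        wtp *= 1.1                         # repeated visits suggest strong interest
    return wtp

def personalised_price(profile: dict, floor: float = 80.0) -> float:
    """Offer a price just under the inferred WTP, clamped to the floor."""
    return max(floor, round(0.95 * infer_wtp(profile), 2))

print(personalised_price({"premium_device": True, "repeat_visits": 5}))  # higher offer
print(personalised_price({}))                                            # baseline offer
```

The sketch makes the legal concern concrete: two consumers see different prices for the same good solely because of inferences drawn from their personal data.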

The duty to give information about personalisation (the new art. 6.1.ebis of dir. 2011/83; the study instead says <art. 6.1.ea>, but the consolidated version of the directive on the EUR-Lex site adopts the former) will count for very little, given the sea of information in which any disclosure is drowned today.

The study naturally also reasons on art. 22 GDPR, concerning automated individual decision-making.


<< This leads to the following conclusions. As price personalisation is expected to become more widespread in the near future and has already occasionally proven to occur, there is a need for regulating this phenomenon.
Given the general rejection by consumers of personalised pricing, regardless of potentially being offered lower or higher prices, and the likelihood of overall consumer detriment of such practices, one could consider prohibiting personalised prices in the form of first degree price discrimination that lead
to a higher than the regular price. At least, there are certain areas where personalised pricing should be prohibited, namely, universal service obligations in areas such as electricity, gas and telecommunications where everyone should have access to services of general interest at the same conditions.
Moreover, while anti-discrimination laws limit the way in which personalised pricing can be performed in that they prohibit the inclusion of certain criteria in the personalisation process (e.g. sex, race, colour, ethnic or social origin, etc.) certain ‘sensitive’ criteria are currently not covered. These could be prohibited to be used for the personalisation of prices, including health conditions, and vulnerabilities such as anxieties that should not be exploited.
Otherwise, information obligations regarding personalised pricing could be extended to all goods and services and to offline or hybrid situations, and information provided should be ‘meaningful’, a notion well-known from data protection law. Thus, traders would have to disclose how prices are personalised
and what criteria are used to do so. Moreover, traders should be required to place information on personalised pricing next to the price in such a way that it cannot be overlooked.

Enforcement should be facilitated through the reversal of the burden of proof once there is an indication of price personalisation. Competent authorities could be granted access to the algorithm that is used. >>