Vaccine delays: legal solutions for technical problems

Delays in deliveries of AstraZeneca vaccine doses are raising doubts among the public (and among politicians) about whether the vaccination plans laid out by governments can be met.

For these reasons, alternative solutions are being considered in many quarters, and in Italy the debate has recently shifted to the patents that protect vaccines, which many regard as an obstacle to public health.

Indeed, patents - like other intellectual property rights - are essentially monopoly rights granted to their holder for a period of twenty years, during which the protected invention may be exploited commercially.

As is well known, the purpose of these 'exclusive rights' lies in the increased welfare of the community: by rewarding inventors for a limited period of time, the community achieves the dissemination of knowledge that would otherwise be kept hidden as far as possible.

However, the global health emergency calls for reflection on whether exclusive patent rights may be set aside in view of the need to obtain patent-protected vaccines very quickly.

In other words, we should ask ourselves whether, in this situation of absolute necessity - where the health of citizens is at stake - it is reasonable to persist in looking exclusively after the interests of the patent holder rather than those of the community.

Take, for example, the case of the well-known AstraZeneca vaccine, whose owner was unable to secure production on time and in the quantities originally planned.

For such reasons, it has been suggested in Italy that companies equipped for the purpose could be granted the extraordinary right to produce this vaccine even without holding the patents, naturally in return for payment of a royalty.

This solution would make it possible to avoid supply shortfalls, while pharmaceutical companies would in any case receive financial compensation for their research and inventive activity.

More generally, the proposals made and discussed were as follows:

  1. require large multinational patent-holding companies to grant 'compulsory licences' to companies that can produce vaccines of similar quality;
  2. suspend the effectiveness of the patent, so that anyone can use the knowledge it protects to produce vaccines for as long as the health emergency continues;
  3. expropriate the holders of patents on vaccines by transforming them into state property, against payment of fair compensation.

Such proposals are supported by several legal bases, first of all the international treaty known as the "TRIPS Agreement"[1], which provides for the possibility for the adhering states to impose compulsory licenses for the manufacture and export of medicines to countries that do not have a sufficient production capacity of their own.

The difficulty in using this instrument lies in the fact that it was designed to support developing countries that lack the domestic resources to buy expensive foreign drugs (the case of HIV drugs in Africa is well known).

Its application appears more difficult in countries such as Italy, which is developing its own vaccine and has the resources to buy vaccines produced abroad.

On this point, mention should also be made of article 141 of the Italian Industrial Property Code[2], which provides for the power of the Italian State to expropriate or otherwise use patents or patent applications for reasons of public utility. The term 'public utility' indeed seems to include the current pandemic crisis.

If the hypothesis of expropriation of the patent seems disproportionate or in any case an excessive compression of the rights of the patent holder, the hypothesis of compulsory licenses granted in favour of third companies seems to us a reasonable and viable way to compensate for the various complications described above.

However, the practical - and therefore technical - aspects of producing vaccines against Covid-19 must also be taken into account. Indeed, there are different vaccine production methods and processes, which correspond to different levels of efficacy (for example, among the vaccines already approved by the competent authorities, Pfizer and Moderna guarantee 94-95% efficacy, while AstraZeneca guarantees 82.4%)[3], as well as different abilities to provide coverage against Covid variants.

At the same time, vaccines are not the subject of a single patent, but of several interlinked patents that often belong to different parties. As a result, third-party companies would have to obtain compulsory licenses from a large number of owners, which would lead to greater organisational difficulties.

In addition, it is clear that the patents in question are based on complex, confidential know-how, which would have to be transferred and disclosed by the owners to the licensee companies as soon as possible so that the vaccine could be produced; this would, however, lead to longer production times and additional organisational difficulties.

In light of the above, the suggested route of compulsory licensing certainly seems a feasible way to address the shortage of vaccines and the related delivery delays; in our opinion, however, it requires collaborative conduct between the owner companies and the licensees, rather than open conflict between them, in order to safeguard the priority of public health.

Moreover, the solution under discussion would be immediately operational without the need for new legislation, which often requires a convergence of political forces that is not easy to achieve. This is certainly a further incentive to pursue a result that would benefit the whole community within a short span of time.

[1] Articles 31 and 31bis of said Agreement, in its wording updated to the Protocol of 6 December 2005, which was enacted on 23 January 2017.
[2] Legislative Decree No 30 of 2005, as amended.
[3] https://www.aifa.gov.it/web/guest/domande-e-risposte-su-vaccini-mrna; https://www.aifa.gov.it/domande-e-risposte-su-vaccini-vettore-virale.


Internet of Things and Artificial Intelligence: the end or beginning of standard essential patents?

The COVID-19 pandemic has forced everyone into quarantine, which in turn has imposed a reorganisation of personal and working life within our homes.

All of this has highlighted and increased our already worrying dependence on IT tools and new technologies, the use of which grew exponentially in 2020 in all sectors, even those where this would once have been difficult to imagine (consider, for example, court hearings held remotely via audio-video link, distance learning in schools, etc.).

Similarly, we are also witnessing ever-increasing digital integration in the objects, devices, sensors and everyday goods that have now become part of our daily life.

With that being said, we should ask ourselves now what impact the current technological revolutions will have within the field of intellectual property and, in particular, within the patent sector.

In our view, the current changes will certainly rejuvenate the field of inventions; indeed, to the extent relevant here, it should be noted that, thanks to the decisive role of artificial intelligence and the “internet of things”, we may legitimately expect an increase in the filing of so-called standard essential patents.

It is well known that standard essential patents (SEPs) are patents protecting technologies considered to be - indeed - essential for the implementation of standards recognised by the relevant standard-setting organisations.

These patents are already more present in our lives than we imagine: we rely on them when calling others, sending messages via our smartphones, sending files via e-mail, listening to our music playlists or simply watching our favourite TV series on the couch at home.

Today, the best-known standards probably include “Bluetooth”, “WiFi” and “5G” but, as noted above, performing any of the above actions involves dozens of standards, each of which is in turn protected by such patents.

In a communication sent to the European Parliament last November, the European Commission highlighted the crucial role of standard essential patents in the development of 5G technology and the Internet of Things, noting for example that for mobile connectivity standards alone ETSI (the European Telecommunications Standards Institute) has recorded declarations covering more than 25,000 patent families.

However, in the same communication the Commission also highlighted the difficulties that some businesses encounter in reaching licensing agreements with the holders of standard essential patents, which has consequently led to a rise in disputes between rights-holders and users.

Indeed, it is known that a patent is classified as essential on the basis of a sort of self-declaration by its holder, to the effect that the patent is necessary and essential for the application of a standard; by means of this declaration, the holder also declares its willingness to grant a license over the patent to those who intend to use the relevant standard under so-called “FRAND” conditions, namely conditions that are Fair, Reasonable And Non-Discriminatory.

What occurs in practice is that the holder of the standard essential patent, having ascertained the presence within the market of a product that uses a certain standard, will turn to its producer or distributor and ask the latter to sign a license agreement containing “FRAND” conditions.

At that point the user has little choice but to accept the license on the conditions proposed by the patent holder. Unlike what happens with patents that are not standard essential, where the user may search for alternative solutions that do not infringe the patent, this is not possible with standard essential patents: they concern standards used to comply with technical specifications that underpin millions of products and ensure interoperability between them.

Moreover, investing in the development of an alternative standard is very expensive (for example, consider the development of a potential alternative to the “Bluetooth” standard), but – even if we should assume the feasibility of developing an alternative standard – consumers would then have to be persuaded to “switch” to a new standard and substitute their devices with new ones.

The risk that this kind of situation may distort the market, and in particular give rise to abuses by holders of standard essential patents, is therefore very high; such holders can decide the fate of a product within a given market because they effectively force all operators in that market to use the standard upon payment of a royalty.

In order to balance the interests at play, the well-known judgment of the Court of Justice in “Huawei v. ZTE” (C-170/13 of 16.07.2015) had already, in 2015, imposed a series of obligations upon holders of standard essential patents, including: a) the obligation to guarantee at all times so-called FRAND conditions in favour of potential licensees; b) the obligation to warn the user of the protected standard in advance, indicating the patent that has been infringed and specifying how the infringement has occurred, and, only if the user fails to cooperate, to commence legal proceedings.

According to the Court of Justice, if these conditions are met it cannot be held that the holder of the standard essential patent has abused its dominant position within the market, and therefore no sanction may lie under art. 102 of the TFEU.

However, reality is somewhat different, insofar as holders of standard essential patents still have excessive negotiating power vis-à-vis users of the protected standard. Indeed, as already noted, whether or not a patent is essential depends on a self-declaration by the patent holder itself, which also establishes a “de facto” presumption of the patent's “essentiality”; this further favours holders in legal proceedings, because the burden of proof falls on the alleged infringer, who must prove non-infringement or the non-essential nature of the patent.

It should also be noted that there are as yet no provisions protecting the weaker party, namely the user of the standard essential patent; for example, there are no reference criteria that clearly define conditions that are fair, reasonable and non-discriminatory. In other words, the user cannot verify whether the conditions proposed by the patent holder are actually “FRAND”, leaving only two options: accept the conditions, or resist and start proceedings against the patent holder.

Even though the matter of standard essential patents has formed the subject of several judgments and specific calls by the European Commission throughout the years, several questions have been left open and require immediate action by the legislator in order to strengthen legal certainty and reduce the rising number of disputes within this field.

In our opinion, it would be advisable, for example, to establish an independent body that could verify in advance the essential nature of a patent before it is declared as such, and to create specific, effective and fair rules capable of regulating the grant of licenses for standard essential patents.

Furthermore, considering the ongoing technological revolution and the consequent increase in the use of such patents, we trust that these reforms will be introduced in a timely manner.


Facebook again in AGCM's spotlight: could an administrative fine be the solution?

On 17 February 2021, the Italian Competition Authority ("AGCM") imposed a fine of 7 million euros on Facebook Ireland and its parent company Facebook Inc. for failing to comply with a measure issued by the AGCM on 29 November 2018 - the same order under which the Menlo Park company had previously been ordered to pay a 5 million euro fine for failing to inform its users that their personal data would be used for commercial purposes.

This additional measure - in effect a "fine on top of the fine" - inevitably raises questions about the actual coercive force of the economic sanctions issued by the AGCM, as well as about its choice to insist on fines similar to those already issued in 2018 despite the ascertained non-compliance - and therefore ineffectiveness - of those fines (we have already discussed this topic in a previous article, available here).

So, has the time come to readjust the content and scope of the sanctioning measures issued to protect the market and free competition, especially when they are aimed at online giants? Could one, for example, envisage a measure by the Authority temporarily blocking the online services offered until the company regularises its position? Or the temporary removal of an application from the App/Play Store, making it impossible for the company to acquire new users? Or, again, fines proportional to the total annual turnover of a company or its group (in “GDPR” style)?

In our opinion, a concrete solution can only lie in the correct balancing of the interests at stake: on the one hand, the private interests of the multinationals who, by means of their social networks, acquire huge quantities of personal data (and, therefore, of “money”); on the other hand, the interests of online users who use the main social platforms on a daily basis, often not just for “scrolling leisure” but also for professional and business reasons.


The Italian Data Protection Authority and its 2021 inspection activity: biometric data, video surveillance, food delivery and data breaches in the spotlight between January and June

The Italian Data Protection Authority (DPA) has defined the scope of the inspection activity planned for the first six months of 2021. This will include some 50 inspections, to be conducted in part by the Italian Finance Police (under delegation from the DPA), focusing on verifying compliance with the applicable privacy laws in the following matters of general interest:

  1. processing of biometric data for facial recognition also through video surveillance systems;
  2. processing of personal data in the context of the so-called "domestic video surveillance" sector and in the sector of audio/video systems applied to games (so-called connected toys);
  3. processing of personal data carried out by "data brokers";
  4. processing of personal data carried out by companies operating in the "Food Delivery" sector;
  5. data breaches.

From this list two big developments emerge: in particular, this year the Italian DPA will extend its inspections also to the processing of biometric data, as well as to the processing carried out through video surveillance systems. These are two areas governed not only by the GDPR and the Privacy Code but also by various guidelines and other legal provisions, as well as by extensive case law.

Let us mention, just for example, the Guidelines of the Italian DPA on biometric recognition and graphometric signature of 2014, the renewed Article 4 of Law no. 300/1970 and Administrative Memo no. 5/2018 issued by the National Labour Inspectorate, the decision of the Italian DPA on video surveillance of 2010 and the recent FAQ on video surveillance of 5 December 2020, the national and EU case law concerning the monitoring of workers and the so-called "defensive controls", Opinion no. 2/2017 of the former Working Party art. 29 ("Opinion 2/2017 on data processing at work") as well as Guidelines no. 3/2019 of the European Data Protection Board (EDPB) on processing of personal data through video devices.

These considerations highlight the complex task of correctly identifying the privacy requirements to be met by data controllers and processors - i.e. the economic operators. In particular, before embarking on an activity involving the processing of biometric data or the use of video surveillance systems, it is necessary to clarify the specific circumstances of the case (identifying the purposes of the processing, the security measures to be adopted, the possible involvement of third-party providers, etc.) in order to correctly prepare the privacy documents required by the many applicable regulations (possibly with the help of specialised professionals).

Therefore, it will be interesting to analyse the results of the inspection activity of the Italian DPA to understand what will be - three years after the enactment of the GDPR - the level of compliance that the Authority will consider “acceptable” and what is the real level of compliance reached by the companies operating in our country who process special categories of personal data and use video surveillance systems.

Of course, the privacy obligations relating to the processing of biometric data or processing through video surveillance systems come on top of those generally required for the processing of personal data. Consequently, in order to achieve full compliance with the privacy regulations in force, it is necessary not only to regulate particular areas of business activity (such as video surveillance or biometrics) but also to adopt (or rather, to have already adopted) a solid internal privacy structure which, in the event of an inspection, can demonstrate to the authorities that the processing of personal data fully complies with the relevant legal provisions.

With particular reference to video surveillance, we would like to remind you that our Firm has developed and published on its website the quick and useful Guidelines for the installation of video surveillance systems, updated with the latest Italian and European regulations. You can consult the Guidelines here.


Sixthcontinent case: can e-commerce platforms “satisfy” legitimate expectations of users?

On 18 January 2021, the Italian Competition Authority (AGCM) announced that it had imposed a fine of 1 million euros on the digital platform "Sixthcontinent" for having - arbitrarily and without prior notice - prevented consumers from using the products, money and other utilities purchased or obtained on the platform including, to the extent relevant here, the credits accumulated through their purchases.

Sixthcontinent is primarily an e-commerce platform: registered users buy shopping cards through which they can then make purchases online or directly in physical stores. For each new purchase made with a shopping card, the user receives bonuses that can be used in lieu of money to buy additional shopping cards, allowing for considerable savings. The platform also functions as a social network, in that it allows users to interact with each other and take advantage of their virtual social relationships. Indeed, users obtain points - which may also be used for purchases - when they introduce the platform to new members, or when people connected to them (through a mechanism similar to Facebook "friendships") make purchases on the platform.

Thus, the idea behind this business is to take advantage of the community (in this case a virtual one) by means of accumulating points and credits and then saving on the purchase of essential goods, such as food and fuel.

In essence, the contract signed between the platform and the user at the time of registration provides for a commitment on the user's part to introduce the platform to as many people as possible and, on the platform's part, to offer users a virtual environment in which they can save money on their purchases.

What deserves attention in this case is precisely the role that the platform takes on and the relationship it establishes with each consumer. Given that these platforms present themselves as instruments for savings and advantages in the purchase of goods for daily use, and that their strength lies in their large user base (Sixthcontinent counts several million users worldwide), it may be said that they generate trust among the public of users that is worthy of protection.

The greater the use of the platform by the user (think of the need to purchase essential goods during lockdown periods), the greater the expectation that is created within the user.

As already noted above, in just a few days the Sixthcontinent platform rendered the shopping cards unusable, also requiring users to accept a refund in "points" instead of money. Its conduct is similar to that of a bank which, out of the blue, decides to prevent account holders from withdrawing their money or making payments from their accounts.

In its decision the AGCM goes to great lengths in condemning the platform in question for having unlawfully imposed a refund on the consumer in the form of "points", without therefore having left the consumer free to choose how to receive his or her credit.

As is well known, this conduct is not in line with the rules of the Consumer Code (Legislative Decree no. 206/2005) and it is certainly not the first time that the AGCM has sanctioned an e-commerce platform for this type of conduct.

There is, however, a further insight that may be gathered from that decision, which is the intention of the authority to sanction the deceptive conduct pursued by the platform in presenting users with the alleged convenience of joining the community and its various offers, and then proceeding with the unjustified blocking of the platform itself.

In other words, the sanctioning measure of the AGCM seems to highlight the existence of an actual responsibility of the online platform towards registered users, inasmuch as it presents itself as a virtual environment where it is possible, also thanks to the interactions between registered users, to purchase essential goods at a lower price.

The platform that makes such an offer to the public creates in its users a legitimate expectation of being able to use the platform itself safely and continuously, especially for those purchases that are socially more "sensitive" because they are essential to daily life. Consequently, the aforementioned decision intends to affirm that there is an infringement of this expectation when the service offered through the platform is suddenly and arbitrarily interrupted.

What we have said leads us to reflect on two important aspects.

First of all, it is reasonable to say that an e-commerce platform undertakes a general contractual commitment to the public of registered users, which cannot be arbitrarily disregarded without notice, just like any contract concluded between two parties (in this case, concluded online).

On the other hand, it is reasonable to state that the continuous provision of services offered online, like any other service, creates a legitimate expectation on the part of the community of registered users, which as such can be protected and enforced through the civil and administrative remedies offered by the legal system.


Guidelines for the installation of video surveillance systems

Updated to the Italian Data Protection Authority’s FAQs of 5 December 2020 and to the EDPB's Guidelines no. 3/2019 on processing of personal data through video devices


GENERAL RULES

  1. To comply with the principle of "data minimization", the data controller must choose the video surveillance systems to be installed and the positioning of the cameras on the basis of the specific purposes of the processing, and must collect and process only personal data that is relevant and not excessive for such purposes.
  2. No prior authorization from the Italian Data Protection Authority is needed for the installation of video cameras, but the data controller must carry out an independent assessment of the lawfulness and proportionality of the processing, taking into account the context and purposes of the processing as well as the risks to the rights and freedoms of natural persons.
  3. A privacy notice must be provided to the data subjects, both in short form (through a sign clearly visible to those passing through the monitored area - a model of this sign is available on the website of the Italian Data Protection Authority - https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9496244) and in extended form.
  4. An autonomous assessment of the retention periods for the images must be carried out by the data controller (in accordance with the principle of "accountability" set forth by the GDPR), considering the context and purposes of the processing as well as the risks to the rights and freedoms of natural persons; this is without prejudice to specific provisions of law that determine how long the images must be stored in particular circumstances.
  5. A DPIA must be drafted when new-technology cameras or "integrated" and/or "intelligent" video surveillance systems are installed (which, for example, detect, record and automatically report anomalous behaviours or events to the competent authorities), in the case of systematic monitoring of a publicly accessible area on a large scale (e.g., highways, large shopping malls), and in the other cases provided for by articles 35 and 36 of the GDPR and by provision no. 467/2018 of the Italian Data Protection Authority.

 

SPECIFIC CONTEXTS

 


WORKPLACE
(art. 4 of Italian Law no. 300/1970)

Purposes of processing: organizational and production needs, work safety and protection of company assets.
If the employer can remotely monitor the employees' activities through the video cameras:

  • an agreement with the company trade union representatives (RSA/RSU) or the prior authorization from the National Labour Inspectorate is required;
  • it is mandatory to carry out the DPIA;
  • it is necessary to draft internal policies to be provided to the employees describing in a clear and transparent manner the methods of use of the working tools (pc, smartphone, etc.) and the possible controls that the employer can carry out over the employees;
  • it is necessary to comply with the privacy obligations under the GDPR and the Privacy Code.

***


PRIVATE PROPERTY / BUSINESS PREMISES

Purposes of processing: monitoring and protection of private property or business premises, prevention of theft and/or vandalism, etc.
Specific conditions to be met:

  • limitation of the angle of the video cameras to areas of exclusive pertinence, excluding common areas (courtyards, ground floors, etc.) or areas belonging to third parties;
  • prohibition on filming public areas or areas of public transit;

 If specific "home" cameras (so-called "smart cams") are installed within your home, it is necessary to:

  • inform any employees (housekeepers, carers, etc.) of the presence of the video cameras;
  • avoid monitoring environments where this would harm personal dignity (such as restrooms, locker rooms, etc.);
  • protect adequately with appropriate security measures the personal data collected or that can be acquired through the smart cams.

***


CONDOMINIUM

Purposes of processing: monitoring and protection of the common parts of the building and in general of the individual properties.
Specific conditions to be met:

  • pursuant to art. 1136 of the Italian Civil Code, a prior deliberation of the condominium meeting is necessary;
  • the maximum period for the storage of the images is 7 days from the collection (unless there are other proven needs to extend such deadline).

 ***


PARTICULAR CATEGORIES OF PERSONAL DATA
(hospitals and clinics)

Purposes of processing: protection of the patients’ health, monitoring of particular hospital departments, etc.
If the video surveillance is used to collect particular categories of data (e.g., to monitor the patient's health), it is necessary to:

  • check the existence of a legal basis for the processing under art. 9 of the GDPR (such as, for example, the provision of health care or treatment, ensuring high standards of quality and safety of health care, etc.);
  • pay special attention so that the collection of personal data is limited to only that data necessary for the purposes of the processing ("minimization");
  • carry out the mandatory DPIA if the processing of personal data concerning patients, disabled persons, mentally ill persons, minors and the elderly is not occasional;
  • constantly monitor the security measures (data storage systems and access to data) applied to the processing.

***


CIRCULATION OF VEHICLES

Purposes of processing: assessment and detection of violations of the highway code.
Specific conditions to be met:

  • limitation of the positioning and angle of the video cameras to the areas necessary for the detection of violations;
  • deletion/obscuration of any images collected but not necessary for the purposes of the processing (e.g., images of pedestrians or other road users, passengers present in the vehicle, etc.);
  • performance of the mandatory DPIA in case of processing of personal data on a large scale (e.g., highways) to monitor drivers' behaviour.

***


MUNICIPAL LANDFILLS

Purposes of processing: control and monitoring of hazardous-substance landfills and "eco stations" (checking the type of waste dumped, the time of deposit, etc.).
Limitations:

  • only a public body/entity (not a private person/entity) is allowed to conduct the monitoring;
  • the monitoring is permitted only if alternative tools and controls are not possible or not effective for reaching the same purposes.

***


EDUCATIONAL INSTITUTES

Purposes of processing: protection of the building, of school properties therein, of staff and students, protection from vandalism, etc.
Specific conditions to be met:

  • the video cameras that capture the interior of the institutes may be activated only during closing hours, and therefore not during school or extracurricular activities;
  • if the video cameras capture images of the areas outside the school, their angle must be properly limited.

***


URBAN SAFETY

Purposes of processing: protection of urban safety in public places or in areas open to the public.
Specific conditions to be met:

  • storage of the images for a maximum period of 7 days after the collection, unless there are special preservation needs for a longer period (art. 6, para. 8 of Law Decree no. 11/2009).
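Where the 7-day limit is enforced automatically, the rule can be sketched in a few lines of code. The following is a minimal illustration only, not a compliance tool: the directory layout, the file naming and the use of file modification times as the moment of collection are all assumptions made for the example.

```python
import time
from pathlib import Path

# Hypothetical sketch: enforce the 7-day storage limit by deleting
# recordings whose file modification time (taken here as the moment of
# collection) is more than 7 days in the past.
RETENTION_SECONDS = 7 * 24 * 60 * 60  # 7 days

def purge_expired(footage_dir: str) -> list:
    """Delete recordings older than the retention period; return their names."""
    now = time.time()
    deleted = []
    for f in sorted(Path(footage_dir).glob("*.mp4")):
        if now - f.stat().st_mtime > RETENTION_SECONDS:
            f.unlink()
            deleted.append(f.name)
    return deleted
```

In practice a controller invoking a routine of this kind on a schedule would still need to document the cases in which a longer preservation period is justified under art. 6, para. 8 of Law Decree no. 11/2009.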

***


VIDEO SURVEILLANCE FROM HIGH ALTITUDES

Data protection laws do not apply to the processing of personal data that does not allow for the identification of natural persons, either directly or indirectly, such as in the case of video surveillance carried out from high altitudes (for example, using drones or similar devices) or in the case of fake and/or switched-off cameras.


Smart printers and smart objects: friends or foes?

On 9 December 2020 the Italian Antitrust Authority (Autorità Garante della Concorrenza e del Mercato, also known by the acronym “AGCM”) imposed, among other measures, a fine of 10 million euros on HP Inc. and HP Italy S.r.l. (hereinafter "HP") for two different commercial practices relating to HP-branded printers that were deemed unfair. For the full text of the measure, see the following link: https://www.agcm.it/dotcmsdoc/allegati-news/PS11144_chiusura.

Firstly, the Authority sanctioned the companies in question for not having correctly informed customers of the installation in their printers of software that allowed printing only with HP toners and cartridges, while preventing the use of non-original refills.

The second conduct that the AGCM considered punishable was the recording - via firmware installed on HP printers and without the knowledge of consumers - of data relating to the specific cartridges being used (both original and non-original). This data was used both to build a database useful for formulating commercial strategies and to deny assistance for printers that had used non-original cartridges, thus hindering the exercise of the legal guarantee of conformity.

With reference to the latter conduct, it is interesting to note that this is a case of distorted use of the so-called "Internet of Things". In fact, this expression refers to a "network of physical objects that contain embedded technology to communicate and sense or interact with their internal states or the external environment" (https://www.gartner.com/en/information-technology/glossary/internet-of-things).

Although in this case the technology used by HP was limited to the collection of information relating to the use of printers, it is clear that the significant presence of objects capable of recording and transmitting data on our daily behaviour could have disturbing implications. The concern comes not just from the possibility that data collections may occur without our knowledge, but also and especially from the uses and purposes that motivate companies to use such data.

Of course, the positive implications that a constant flow of information from objects could provide cannot be ignored, for example, when considering the efficiency and improvement of production chains, and of safety systems for citizens (think of "intelligent traffic lights"). However, cases like the one examined by the AGCM lead us to think about the possibility that these technologies may excessively limit consumers' rights.

The present case therefore offers a lesson: before purchasing a “smart” object, it is certainly advisable to gather as much information as possible on the type of sensors and detectors that may be incorporated in the device and, above all, to ascertain how the data acquired by it will be used.

Furthermore, it is certainly appropriate to ask within what limits the use of these “smart” devices may support innovation and the improvement of society, as opposed to when – on the other hand – such use can compromise the rights of consumers, understood both as the right to be informed and the basic rights which arise following the purchase of a product (let us think about the limitations on the exercise of the above mentioned legal guarantee).


Caterina Bo joins the editorial board of MediaLaws

We are happy to announce that our Caterina Bo has joined the editorial board of MediaLaws for the year 2020/2021.
The editorial board monitors relevant news in the field of media law and contributes comments and analyses to the blog associated with the specialised journal “Rivista di Diritto dei Media”, founded on the initiative of Professors Oreste Pollicino (Università Commerciale Luigi Bocconi), Giulio Enea Vigevani (Università degli Studi di Milano-Bicocca), Marco Bassini (Università Commerciale Luigi Bocconi) and the lawyer Carlo Melzi d'Eril.

#medialaw #insight #news


iPhone: neither water nor AGCM “resistant”

THE CASE

On November 30, 2020 the Italian Competition and Markets Authority (Autorità Garante della Concorrenza e del Mercato, also known by the acronym “AGCM”) imposed a 10 million Euro fine on the companies Apple Distribution International and Apple Italia s.r.l. (hereinafter "Apple") for spreading promotional messages that exalted the water resistance of several iPhone models while failing to specify that this property held only under certain conditions which do not correspond to the normal conditions of use experienced by consumers (a so-called "misleading"[1] commercial practice).

In addition, these promotional messages were contradictory due to the presence of the following disclaimer: "The warranty does not cover damage caused by liquids”. In fact, in the after-sales phase Apple denied repairs of iPhones that were damaged as a result of exposure to water or other liquids, thus hindering the exercise of warranty rights protected by the Consumer Code (a so-called "aggressive"[2] commercial practice).

WHAT CONSEQUENCES FOR THE FINED COMPANY?

The present case allows us to focus attention on the consequences befalling the company that is subject to measures adopted by an authority (in this case, the AGCM).

Firstly, such a measure will have repercussions on the very complex relationship of trust that is established between a business and its consumers.

Indeed, it is well known that a trademark suggests to the consumer, in the very moment of purchase, that the product marked by that sign originates from a certain company.

A psychological and emotional relationship is formed between the consumer and the company which - by its very nature - is easily influenced by external circumstances.

It is precisely in this delicate context that the punitive measure of AGCM against Apple acquires relevance since it makes the basis of this relationship - namely trust and reliability - vulnerable.

It should not be forgotten that the basis of this relationship is inherently mnemonic, meaning that it relies on the (positive) memory that the consumer recalls and preserves with respect to a company, its products, and its services. The punitive measure of the authority precisely affects this memory because, among other things, it aims to warn the consumer of future commercial behaviours.

Indeed, it is easy to imagine how nowadays the news of this measure is spreading quickly through social networks, thus reaching a considerable portion of the company’s clients. On this point, one can understand the AGCM’s decision to force Apple to also publish the measure in the section of its website dedicated to the sale of iPhones, under the heading "Information for consumer protection".

This has a greater impact on the company than an economic penalty, in so far as it damages the company's image and prompts consumers to pay more attention when planning to purchase products from the company affected by the punitive measure.

All of this translates into additional "invisible" costs for the company, i.e. costs that the company will have to bear in the following months in order to rebuild the relationship of trust/reliability with its customers (so-called "reconstructive advertising"). That is without forgetting the corrective activities (and related costs) that, following the warning received from the Authority, the fined company will have to put in place in relation to products already on the market or soon due to be introduced.

From a different point of view, the AGCM’s action also carries a financial penalty.

In the case in question, the penalty imposed, while constituting the maximum amount provided for by current legislation, nevertheless represents less than 0.005% of the total turnover achieved by the Apple group in 2019, amounting to approximately € 231.57 billion.

Therefore, it is legitimate to wonder if this penalty could have a real deterrent effect.

The answer is surely negative. However, attention should be given to the criterion of determination of the amount of the fine.

For example, if the alleged water resistance of the iPhone had been the only reason that brought the consumer to buy an Apple rather than a Samsung product, would a sanction of just 10 million Euros be appropriate?

Evidently not: in that case, the entire cost of an iPhone (about 1,000 Euros) would need to be taken into account, and the penalty would need to be defined as a percentage of the turnover generated by the (unfair) sales that occurred within the territory of the competent authority.

It follows that a predetermined criterion for the quantification of a penalty - such as that provided by the Consumer Code for unfair practices - is in itself insufficient to assess all the circumstances of the case and consequently to appropriately punish the commercially unsound conduct of a business.

FINAL CONSIDERATIONS

The impression is that companies, especially those with dizzying sales, are underestimating the importance of maintaining greater transparency towards the market, perhaps in the mistaken belief that certain practices will go unnoticed. On the contrary, as we have just seen, such practices have negative consequences on the relationship with their customers. Moreover, in fields where competition is fierce, the risk of the client choosing a competitor is always around the corner.

However, punitive discipline may not be strict enough to have the desired deterrent effects. Of course, in the end it is the consumer who is at a disadvantage by not receiving adequate protection.

It is therefore reasonable to ask whether punitive methods other than financial ones might more adequately protect the consumer’s interests.

For example, consideration may be given to the possibility of introducing different restrictive measures (which would have a more practical scope and be proportionate to the market share held by the company) against businesses responsible for implementing unfair commercial practices.

[1] Unfair practices are defined as misleading when they represent elements and/or features of a product in a way that does not correspond to the truth.
[2] Unfair practices are defined as aggressive when they consist of harassment, coercion or other forms of undue psychological conditioning of consumers.


Facial Recognition and Digital World: technology at the service of convenience?

How many of us unlock our smartphones, make an online payment, authorize the download of an app and/or access a web portal simply by bringing the mobile device closer to our face? How easily do we "tag" our friends in our pictures on the most well-known social networks? And again: how many and what advantages may be obtained from knowing the number of passers-by who stop, even if just for a moment, to look at a billboard?

Statistics show that facial recognition technology is at the service of a digital world that "runs" faster and faster and which forces us to keep up with the times. But at what price for the protection of our personal data?

1. Introduction

Social networks, e-commerce websites, online magazines, home banking and mobile apps: there are millions of digital services available online that we can use through the creation of personal accounts.

When creating profiles, the most widespread trend, especially among young people, is to rely on easy and intuitive passwords (such as a date of birth or first name) which are not very secure from an IT point of view and are often identical across all the services those people use[1].

In order to counter these bad habits - which only feed the already high number of data breaches - it has become common to use so-called "facial recognition" technology (in Italian, "riconoscimento facciale"). This is a type of IT process that associates the features of a person's face with a digital image and stores that image in an electronic device, so that it can be reused not only as a means of identification but also for the authentication, verification and/or profiling of individuals.
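At its core, the "association" this definition describes can be pictured as a comparison between numeric templates extracted from face images. The following is a purely illustrative sketch: the template vectors and the threshold are invented for the example, and real systems derive templates from images using specialised models.

```python
import math

# Illustrative only: a facial recognition system reduces a face image to a
# numeric feature vector (a "template") and decides whether two captures
# belong to the same person by comparing the distance between the vectors
# against a threshold. Vectors and threshold below are invented.

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def same_person(stored, captured, threshold=0.6):
    """Authenticate when the stored and freshly captured templates are close enough."""
    return euclidean(stored, captured) <= threshold

enrolled = [0.12, 0.80, 0.45]  # template stored at enrolment
probe = [0.10, 0.82, 0.44]     # template captured at authentication
```

For these two nearby vectors `same_person(enrolled, probe)` succeeds, while a distant probe is rejected; the choice of threshold is precisely where the trade-off between false acceptances and false rejections (and thus between convenience and security) is made.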

But is it really always safe to rely on facial recognition? Does a biometric system always guarantee sufficient protection of our personal data?

2. The most frequent uses of facial recognition technology

It is well known that various biometric techniques lend themselves to being used mainly in the IT context (for example, for authenticating access to a device), and the trend among the main high-tech companies is to invest ever greater amounts of money in this field.

However, facial recognition is also used outside the digital world: take for example the use of biometric systems for the control of physical access to reserved areas, for the opening of gates or for the use of dangerous devices and machinery.

But that's not all. Facial recognition techniques are also capable of serving public authorities and even research. The police in New Delhi have in fact tested facial recognition to identify almost 3,000 missing children, and researchers have used it to detect a rare genetic disease found in subjects from Africa, Asia and Latin America[2].

Faced with such a large number of uses of facial recognition, it is worrying that specific national legislation on the matter has not yet been enacted in our country. Indeed, agreeing to the detection and collection of the features of our face by a data controller means sharing with the latter a wide range of personal data and exposing ourselves to whatever processing the controller decides to make of such data.

Think about a simple "selfie" taken with our smartphone: in these cases our device collects our personal image and stores it in memory. Or think about walking past billboards that detect our presence, having our body temperature measured by video and digital thermometers, or the video-recognition boarding systems installed in the world's largest airports.

3. A quick vademecum for the processing of biometric data

The biometric characteristics of a face that allow for the unique identification of a natural person fall within the notion of "biometric data" provided by European Regulation no. 679/2016 ("GDPR")[3]. In fact, biometric data is defined by the GDPR as data "resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person"[4]. This means that an image / a photograph is not always qualifiable as biometric data if it is not processed through specific technical means that allow for the unique identification or authentication of a natural person[5].

Biometric data also fall within the category of "special categories of personal data" pursuant to art. 9 of GDPR (referred to by art. 2-septies of Legislative Decree no. 196/2003 - "Privacy Code") and can be processed only when the data controller complies with certain legal obligations. Let's try to list some of these obligations here below:

A. Compliance with the fundamental principles of the processing. In an increasingly digital world, the principles of "privacy by design" (data protection by design) and "privacy by default" (data protection by default) provided for by art. 25 GDPR play a leading role[6]. In order to comply with these principles, data controllers who use facial recognition for the processing of personal data must, starting from the design and definition phases of the processing tools, provide adequate security measures to ensure the protection of the fundamental rights and freedoms of individuals as well as compliance with the principles set out in Article 5 of the GDPR.

Specifically, attention should be paid to the principle of "data minimization", which requires the data controller to configure a biometric recognition system so as to collect and process only a limited amount of information, excluding the acquisition of additional data that is not necessary for the purpose to be achieved in the specific case (for example, if the purpose of the processing is computer authentication, biometric data should not be processed in such a way as to infer any information of a sensitive nature belonging to the data subject including, for example, clearly visible skin diseases).
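The minimization principle can be pictured with a small sketch: of everything a capture device could record, only the fields strictly needed for the authentication purpose are retained before storage. All field names below are invented for illustration.

```python
# Hypothetical example of "data minimization": the record produced by a
# biometric capture is stripped down to the fields strictly required for
# the authentication purpose before being stored. Field names are invented.

NEEDED_FOR_AUTHENTICATION = {"user_id", "template"}

def minimize(captured_record):
    """Retain only the fields required for the stated purpose of the processing."""
    return {k: v for k, v in captured_record.items() if k in NEEDED_FOR_AUTHENTICATION}

raw_capture = {
    "user_id": "u42",
    "template": [0.12, 0.80, 0.45],   # needed for authentication
    "raw_image": b"...",              # not needed once the template is extracted
    "skin_condition_flags": ["..."],  # sensitive inference that must not be kept
}
minimized = minimize(raw_capture)  # keeps only user_id and template
```

The design point is that discarding the raw image and any sensitive inferences at the earliest possible stage, rather than filtering them later, is what "by design" means in practice.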

B. Information notice. The data controller must provide the data subjects with a privacy notice in accordance with art. 13 of GDPR which, in a clear and transparent manner, indicates the purposes of the processing, the security measures adopted, the possible centralization of the biometric data collected, and the storage periods of the personal data. In this regard, it is appropriate to point out that, as clarified by the Italian data protection Authority[7], such privacy notice has to be delivered before the so-called "enrolment" phase, which takes place before the creation of a biometric sample[8].

C. Legal basis of the processing. The data controller must ask for the prior consent of the data subjects in order to process their biometric data, or alternatively the data controller should assess the possibility of relying on another legal basis under Article 9 of the GDPR (including, for example, the existence of reasons of public interest in the area of public health, such as the protection against serious cross-border threats to health).

D. DPIA. As provided for by art. 35 of the GDPR and Annex 1 to Provision no. 467/2018 of the Italian data protection Authority, the data controller must assess the impact of the processing of biometric data and specifically assess the risks that such processing may entail for the rights and freedoms of individuals and, at the same time, identify the security measures adopted and to be adopted to address these risks.

E. Appointment of the data processor. Where the data controller engages a third party for the processing of biometric data, the latter must be appointed as "data processor" pursuant to art. 28 of GDPR, following verification that the third party offers suitable guarantees for the protection of the rights of the data subjects whose biometric data is processed.

F. The implementation of alternative systems. The data controller must offer alternative solutions that do not involve the processing of biometric data, without imposing restrictions or additional costs on the data subject. Such alternative solutions are necessary especially for those who are not able to comply with the constraints imposed by a biometric system (think about a disabled person who is not able to reach, with his or her face, the height of a thermoscanner) and in cases where such a device is unavailable due to technical problems (for example, a malfunction).

4. Conclusions

The applicable data protection regulations are not and should never be considered as an obstacle to the development of new technologies applied to the IT and digital industry. On the contrary, compliance with existing legislation should be an incentive for creating practical solutions in a way that respects the confidentiality of our information.

This should also be the case for facial recognition technology, in relation to which it is important to make users aware of the security of the processing of their personal data - not least because generating awareness means gaining consumers' trust, which is the first step of a correct marketing strategy.

This is what Apple has done with the recent "iOS 14" update, which allows owners of the latest mobile devices to know - through colour indicators (green and orange) that appear on the status bar of the device - whether an installed app is using the camera and thus capturing the user's image.

On the other hand, the protection of our personal data must never be sacrificed. To this end, in our opinion, it is essential that our country enact regulations governing this technology. The added value that facial recognition can bring to our economy has long been plain for all to see, but if we do not act at the regulatory level in the short term, we risk facing, in a few years' time, the uncontrolled development and use of these technical solutions, with the consequence of having to spend time and economic resources solving multiple problems rather than reaping new advantages.

 

[1] This is confirmed by an interesting (and worrying, for all of us) study published during the “Safer Internet Day”, according to which more than half of Italian millennials (55%) use the same password to access different services and 19% use extremely simple passwords such as a numbered sequence.

[2] Also noteworthy is the new project "Telefi" funded by the European Commission and called "Towards the European Level Exchange of Facial Images" (TELEFI). It is a study on the benefits that the use of facial recognition can provide to crime investigation in EU Member States and the exchange of data collected within the "Prüm" system, through which DNA, fingerprints and vehicle registration data are exchanged between EU countries to combat cross-border crime, terrorism and illegal migration.

[3] Classic examples of biometric data, in addition to the characteristics of the face, are: the fingerprints, handwritten signature placement dynamics, the retinal vein pattern, the iris shape, the characteristics of the voice emission.

[4] See, for more details, the Opinion of the Working Party ex art. 29 (now replaced by the “European Data Protection Board”) no. 2/2012 - https://www.pdpjournals.com/docs/87997.pdf.

[5] See Recital no. 51 GDPR.

[6] See “Guidelines no. 4/2019 on Article 25 Data Protection by Design and by Default” - Version 2.0 Adopted on 20 October 2020.

[7] See on this matter “Guidelines on biometric recognition and graphometric signature” issued by the Italian data protection Authority on 12 November 2014.

[8] The term "enrolment" refers to the process through which a subject is accredited to the biometric system, through the acquisition of one of their biometric characteristics. Indeed, to enable biometric recognition it is necessary to acquire the biometric characteristic via a procedure ensuring that enrolment is performed appropriately, that the link with the captured subject is retained, and that the quality of the resulting biometric sample is safeguarded. Generally, the facial biometric sample is used to extract, via algorithms sometimes based on so-called “neural networks”, a set of features such as the location of the eyes, nose, nostrils, chin and ears in order to build a biometric template.