Facebook again in AGCM's spotlight: could an administrative fine be the solution?

On 17 February 2021, the Italian Competition Authority ("AGCM") imposed a fine of 7 million Euros on Facebook Ireland and its parent company Facebook Inc. for failing to comply with a measure issued by the AGCM on 29 November 2018. This was the same order under which the Menlo Park company had previously been fined 5 million Euros for failing to inform its users that their personal data would be used for commercial purposes.

This additional measure - which is really just "the fine applied to the fine" - makes us question the actual coercive force of the economic fines issued by the AGCM, as well as its choice to insist on fines similar to those already issued in 2018 despite the ascertained non-compliance with - and therefore ineffectiveness of - those same fines (we have already discussed this topic in a previous article, available here).

So, has the time come to readjust the content and scope of the sanctioning measures issued to protect the market and free competition, especially when they are aimed at online giants? Could one, for example, envisage a measure of the Authority temporarily blocking the online services offered until the company regularizes its position? Or could we witness the temporary removal of an application from the App Store/Play Store, making it impossible for the company to acquire new users? Or, again, will we see fines proportional to the total annual turnover of a company or its group (GDPR-style)?
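To give a sense of what a GDPR-style, turnover-proportional fine looks like, the sketch below applies the formula of Article 83(5) GDPR, under which fines may reach 20 million Euros or 4% of total worldwide annual turnover of the preceding financial year, whichever is higher. The figures used are illustrative, not taken from any actual decision.

```python
# Illustration of the GDPR Art. 83(5) cap: the higher of a fixed floor
# (20 million euros) and 4% of worldwide annual turnover.
# Turnover figures below are invented for the example.

def gdpr_style_cap(annual_turnover_eur: float) -> float:
    """Return the maximum fine under the Art. 83(5) formula."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A small business: the 20 million euro floor applies.
print(gdpr_style_cap(10_000_000))        # 20000000.0
# A large group with 70 billion euros of turnover: the 4% prong prevails.
print(gdpr_style_cap(70_000_000_000))    # 2800000000.0
```

For a multinational group, the 4% prong dwarfs the fixed fines typically issued under the Italian Consumer Code, which is precisely why turnover-proportional fines are discussed as a more credible deterrent.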

In our opinion, a concrete solution can only lie in the correct balancing of the interests at stake: on the one hand, the private interests of the multinationals who, by means of their social networks, acquire huge quantities of personal data (and, therefore, of “money”); on the other hand, the interests of online users who use the main social platforms on a daily basis, often not just for “scrolling leisure” but also for professional and business reasons.


The Italian Data Protection Authority and its 2021 inspection activity: biometric data, video surveillance, food delivery and data breaches will be in the spotlight between January and June

The Italian Data Protection Authority (DPA) has defined the boundaries of the inspection activity planned for the first six months of 2021. This will include 50 inspections, to be conducted in part by the Italian Finance Police (under delegation from the DPA), focusing on verifying compliance with the applicable privacy laws in the following matters of general interest:

  1. processing of biometric data for facial recognition also through video surveillance systems;
  2. processing of personal data in the context of the so-called "domestic video surveillance" sector and in the sector of audio/video systems applied to games (so-called connected toys);
  3. processing of personal data carried out by "data brokers";
  4. processing of personal data carried out by companies operating in the "Food Delivery" sector;
  5. data breaches.

From this list two big developments emerge: in particular, this year the Italian DPA will extend its inspections also to the processing of biometric data, as well as to the processing carried out through video surveillance systems. These are two areas governed not only by the GDPR and the Privacy Code but also by various guidelines and other legal provisions, as well as by extensive case law.

Let us mention, by way of example, the Italian DPA's 2014 Guidelines on biometric recognition and graphometric signature, the renewed Article 4 of Law no. 300/1970 and Administrative Memo no. 5/2018 issued by the National Labour Inspectorate, the Italian DPA's 2010 decision on video surveillance and the recent FAQs on video surveillance of 5 December 2020, the national and EU case law on the monitoring of workers and so-called "defensive controls", Opinion no. 2/2017 of the former Article 29 Working Party ("Opinion 2/2017 on data processing at work"), as well as Guidelines no. 3/2019 of the European Data Protection Board (EDPB) on processing of personal data through video devices.

The above considerations highlight the complex task of correctly identifying the privacy requirements to be met by data controllers and processors - i.e., the economic operators. Indeed, especially before embarking on an activity involving the processing of biometric data or the use of video surveillance systems, it is necessary to clarify the particular circumstances of the case at hand (identifying the purposes of the processing, the security measures to be adopted, the possible involvement of third-party providers, etc.) in order to correctly prepare the privacy documents required by the many applicable regulations (possibly with the help of specialized professionals).

Therefore, it will be interesting to analyse the results of the inspection activity of the Italian DPA to understand what level of compliance the Authority will consider "acceptable" three years after the GDPR became applicable, and what level of compliance has actually been reached by companies operating in our country that process special categories of personal data and use video surveillance systems.

Of course, the privacy obligations relating to the processing of biometric data or through video surveillance systems are on top of those generally required for the processing of personal data; consequently, in order to achieve full compliance with the privacy regulations in force, it is necessary not only to regulate particular areas of business activity (such as, for example, video surveillance or biometrics) but also to adopt (or rather, already have adopted) a solid internal privacy structure which - in case of inspections - can prove to the authorities that the processing of personal data carried out fully complies with the relevant legal provisions.

With particular reference to video surveillance, we would like to remind you that our Firm has developed and published on its website the quick and useful Guidelines for the installation of video surveillance systems, updated with the latest Italian and European regulations. You can consult the Guidelines here.


Sixthcontinent case: can e-commerce platforms “satisfy” legitimate expectations of users?

On 18 January 2021, the Italian Competition Authority (AGCM) announced that it had imposed a fine of 1 million Euros on the digital platform "Sixthcontinent" for having - arbitrarily and without prior notice - prevented consumers from using the products, money and other utilities purchased or obtained on the platform, including, to the extent of interest here, the credits accumulated through their purchases.

Sixthcontinent is primarily an e-commerce platform: the registered user buys shopping cards on the platform, which can then be used to make purchases online or directly in physical stores. For each new purchase made with a shopping card, the user receives bonuses that can be used in lieu of money to buy additional shopping cards, allowing for considerable savings. The platform also functions as a social network, in that it allows users to interact with each other and take advantage of their virtual social relationships. Indeed, users obtain points - which may also be used for purchases - when they introduce the platform to new members, or when people connected to them (through a mechanism similar to Facebook "friendships") make purchases on the platform.
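The points mechanics described above can be illustrated with a minimal sketch. The rates and rules below are invented for the example; the actual Sixthcontinent scheme is not public in this level of detail.

```python
# Hypothetical model of a purchase-bonus plus referral-points scheme.
# All rates (5% bonus, 10 referral points) are illustrative assumptions.

class Account:
    def __init__(self):
        self.points = 0.0

    def buy_shopping_card(self, price: float, bonus_rate: float = 0.05) -> float:
        """Buying a shopping card earns bonus points usable in lieu of money."""
        self.points += price * bonus_rate
        return price

    def referral(self, referral_points: float = 10.0) -> None:
        """Introducing a new member credits referral points."""
        self.points += referral_points

acct = Account()
acct.buy_shopping_card(100.0)   # earns 5.0 points on a 100-euro card
acct.referral()                 # earns 10.0 points for a new member
print(acct.points)              # 15.0
```

The accumulated points are, in substance, deferred value held by the platform on the user's behalf - which is why their sudden cancellation is at the heart of the AGCM's decision.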

Thus, the idea behind this business is to take advantage of the community (in this case a virtual one) by means of accumulating points and credits and then saving on the purchase of essential goods, such as food and fuel.

In essence, the contract signed between the platform and the user at the time of registration provides for, on the part of the user, a commitment to introduce the platform to as many people as possible and, on the part of the platform, a commitment to offer users a virtual environment in which they can save money on their purchases.

What deserves attention in this case is precisely the role that the platform takes on and the relationship it establishes with each consumer. Indeed, given that these platforms present themselves as instruments for saving on the purchase of everyday goods, and that their strength lies in their large user base (Sixthcontinent counts several million users worldwide), it may be said that they create trust among the public of users that is worthy of protection.

The greater the use of the platform by the user (think of the need to purchase essential goods during lockdown periods), the greater the expectation that is created within the user.

As already noted above, in just a few days the Sixthcontinent platform rendered the shopping cards unusable, also requiring users to accept a refund in "points" instead of money. The conduct of the Sixthcontinent platform is similar to that of a bank that, out of the blue, decides to prevent account holders from withdrawing their money or making payments from their accounts.

In its decision the AGCM goes to great lengths in condemning the platform in question for having unlawfully imposed a refund on the consumer in the form of "points", without therefore having left the consumer free to choose how to receive his or her credit.

As is well known, this conduct is not in line with the rules of the Consumer Code (Legislative Decree no. 206/2005) and it is certainly not the first time that the AGCM has sanctioned an e-commerce platform for this type of conduct.

There is, however, a further insight that may be gathered from that decision, which is the intention of the authority to sanction the deceptive conduct pursued by the platform in presenting users with the alleged convenience of joining the community and its various offers, and then proceeding with the unjustified blocking of the platform itself.

In other words, the sanctioning measure of the AGCM seems to highlight the existence of an actual responsibility of the online platform towards registered users, inasmuch as it presents itself as a virtual environment where it is possible, also thanks to the interactions between registered users, to purchase essential goods at a lower price.

The platform that makes such an offer to the public creates in its users a legitimate expectation of being able to use the platform itself safely and continuously, especially for those purchases that are socially more "sensitive" because they are essential to daily life. Consequently, the aforementioned decision intends to affirm that there is an infringement of this expectation when the service offered through the platform is suddenly and arbitrarily interrupted.

What we have said leads us to reflect on two important aspects.

First of all, it is reasonable to say that an e-commerce platform undertakes a general contractual commitment to the public of registered users, which cannot be arbitrarily disregarded without notice, just like any contract stipulated between two parties (in this case, stipulated online).

On the other hand, it is reasonable to state that a continuous provision of services offered online, like any other service, creates a legitimate expectation towards the community/registered users, which as such can be protected and enforced by means of civil and administrative remedies offered by the legal system.


EU - UK agreement (so-called “Brexit”): the birth of the “comparable trademark”

It is well known by now that on 24 December 2020 the European Union and the United Kingdom reached an agreement for regulating their future commercial relations following “Brexit”.

This agreement seals the definitive separation between the British and European legal systems and, starting from the date of its enactment (i.e., 1 January 2021, namely the end of the transition period), the rules of European Union law will no longer apply to the United Kingdom, including those concerning intellectual and industrial property rights.

With a view to ensuring an orderly transition towards the new legal regime, the European Commission published a series of “Notices of Withdrawal” (related to the main sectors of European economy) which set out the main practical consequences that will affect the owners of intellectual and industrial property rights.

In particular, the Notice concerning trademarks specifies, among other things, that the owner of an EU trademark registered before 1 January 2021 will automatically become the owner of a "comparable trademark" in the United Kingdom, which will be treated as registered - and may be challenged - in the United Kingdom, in accordance with the laws of that country.

This notion of a “comparable trademark” appears to be new within the field of IP rights, in so far as it was specifically introduced for the purpose of protecting those who – before the definitive withdrawal of the United Kingdom from the European Union – had obtained protection for their EU trademark which, at the time, produced effects also with respect to British territory.

The European Commission – evidently aware of the novelty of this legal concept – in the same Notice clarified that such “comparable trademark”:

      1. consists of the same sign that forms the object of EU registration;
      2. enjoys the date of filing or the date of priority of the EU trademark and, where appropriate, the seniority of a trademark of the United Kingdom claimed by its owner;
      3. allows the owner of an EU trademark that has acquired a reputation before 1 January 2021 to exercise equivalent rights in the United Kingdom;
      4. cannot be liable to revocation on the ground that the corresponding EU trademark had not been put into genuine use in the territory of the United Kingdom before the end of the transition period;
      5. may be declared invalid, revoked or cancelled if the corresponding EU trademark is the subject of a decision to that effect as a result of an administrative or judicial procedure that was ongoing before 1 January 2021 (even if the decision is issued after the "cloning").

The British Government has confirmed that the competent UK office will proceed, free of charge, with the "cloning" of EU trademarks in the United Kingdom, where they will become "comparable trademarks". Owners of EU trademarks are not required to file any request or commence any administrative procedure in the United Kingdom, nor do they need a postal address in the United Kingdom for the three years following the end of the transition period.

Despite the precise description of the main features of the new “comparable trademarks”, there are – inevitably – uncertainties surrounding the practical application of this legal concept.

In particular, it is puzzling to find that the "comparable trademark" continues to be affected by European administrative and judicial developments (see point 5 above), which conflicts with the alleged independence of the United Kingdom from European laws and regulations.

Such inconsistencies have evidently already been noted, in so far as the Notice specifies (in a footnote) that the parties have acknowledged that the United Kingdom "is not obliged to declare invalid or to revoke the corresponding right in the United Kingdom where the grounds for the invalidity or revocation of the European Union trade mark … do[es] not apply in the United Kingdom". It would therefore seem that the United Kingdom is vested with the power not to conform to European decisions.

However, it is not clear which should prevail in this "contest": the invalidating decision issued in the European proceedings, or the British power to deny the effects of that decision?

Furthermore, if the European proceedings – albeit commenced before the end of the transition period – should last for several years, how should the owner of the “cloned” trademark in the United Kingdom behave? Again, how are we to reconcile the existence of a “comparable trademark” – contemporaneously subject to European and British jurisdiction – with the known principle of territoriality applicable to the world of trademarks?

The situation appears somewhat uncertain and, in our opinion, it cannot be excluded that other issues concerning this new “comparable trademark” may arise in the future and form the object of open debate by those operating in the IP industry.

This is a key point that concerns not just acquired rights (which the British Government has undertaken to protect), but also the future political relations between the EU and the United Kingdom; indeed, it is interesting to note in this regard how such a seemingly innocent subject, namely trademark law, may in fact reveal the frailty of an agreement which is formally commercial but in reality turns out to be predominantly of a political nature.

In light of all the above, it would seem that the EU and the United Kingdom have chosen to follow the easiest path for protecting owners of EU trademarks in the midst of an orderly transition towards a new legal regime imposed by Brexit; on the other hand, this decision raises several legal questions, some of which have been anticipated above, which introduce a measure of uncertainty concerning the new “British hybrid” trademark.

Finally, we cannot fail to note how this new legal concept may represent an interesting precedent in the event that other Member States may decide in the future to leave the European Union. In that regard, we find ourselves before a new concept that certainly seems interesting from a legal standpoint, but potentially may also be “dangerous” from a political point of view and, as such, deserving of close attention in the coming years.


Guidelines for the installation of video surveillance systems

Updated to reflect the Italian Data Protection Authority’s FAQs of 5 December 2020 and the EDPB's Guidelines no. 3/2019 on processing of personal data through video devices


GENERAL RULES

  1. Comply with the principle of "data minimization": the data controller must choose the video surveillance systems to be installed and the location of the cameras on the basis of the specific purposes of the processing, and must collect and process only personal data that is relevant and not excessive for such purposes.
  2. No prior authorization by the Italian Data Protection Authority is needed for the installation of video cameras, but the data controller must carry out an independent assessment of the lawfulness and proportionality of the processing, taking into account the context and purposes of the processing as well as the risks to the rights and freedoms of natural persons.
  3. A privacy notice must be provided to the data subjects, both in a short form (through a special sign clearly visible to those passing through the monitored area - the model of this sign is available on the website of the Italian Data Protection Authority - https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9496244) and in an extended form.
  4. An autonomous assessment of the preservation periods of the images must be carried out by the data controller (in accordance with the principle of "accountability" set forth by the GDPR), considering the context and purposes of the processing, as well as the risks to the rights and freedoms of natural persons. This is without prejudice to specific provisions of law that determine how long the images must be stored in particular circumstances.
  5. A DPIA must be drafted when new-technology cameras or "integrated" and/or "intelligent" video surveillance systems are installed (which, for example, detect, record and automatically report anomalous behaviours or events to the competent authorities), in the case of systematic monitoring of a publicly accessible area on a large scale (e.g., highways, large shopping malls), and in the other cases provided for by Articles 35 and 36 of the GDPR and by decision no. 467/2018 of the Italian Data Protection Authority.
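The DPIA triggers in point 5 above can be sketched as a simple check. The function and flag names are ours, for illustration only; the actual legal assessment cannot be reduced to a boolean test.

```python
# Hedged sketch of the DPIA triggers listed above: smart/"integrated"
# systems, large-scale monitoring of publicly accessible areas, and the
# residual cases under Articles 35-36 GDPR. Illustrative only.

def dpia_required(smart_or_integrated_system: bool,
                  large_scale_public_area: bool,
                  other_art35_36_case: bool) -> bool:
    """True when at least one of the listed triggers applies."""
    return (smart_or_integrated_system
            or large_scale_public_area
            or other_art35_36_case)

# A plain camera over a shop entrance, with no other trigger:
print(dpia_required(False, False, False))  # False
# An "intelligent" system automatically reporting anomalous events:
print(dpia_required(True, False, False))   # True
```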

 

SPECIFIC CONTEXTS

 


WORKPLACE
(art. 4 of Italian Law no. 300/1970)

Purposes of processing: organizational and production needs, work safety and protection of company assets.
If the employer can remotely monitor the employees' activities through the video cameras:

  • an agreement with the company trade union representatives (RSA/RSU) or the prior authorization from the National Labour Inspectorate is required;
  • it is mandatory to carry out the DPIA;
  • it is necessary to draft internal policies to be provided to the employees describing in a clear and transparent manner the methods of use of the working tools (pc, smartphone, etc.) and the possible controls that the employer can carry out over the employees;
  • it is necessary to comply with the privacy obligations under the GDPR and the Privacy Code.

***


PRIVATE PROPERTY / BUSINESS PREMISES

Purposes of processing: monitoring and protection of private property or business premises, prevention of theft and/or vandalism, etc.
Specific conditions to be met:

  • limitation of the angle of the video cameras to the areas of exclusive pertinence, excluding common areas (courtyards, ground floors, etc.) or areas belonging to third parties;
  • prohibition on filming public areas or areas of public transit;

If "home" cameras (so-called "smart cams") are installed within your home, it is necessary to:

  • inform any employees (housekeepers, carers, etc.) of the presence of the video cameras;
  • avoid monitoring environments that would harm a person’s dignity (such as restrooms, locker rooms, etc.);
  • adequately protect, with appropriate security measures, the personal data collected or obtainable through the smart cams.

***


CONDOMINIUM

Purposes of processing: monitoring and protection of the common parts of the building and in general of the individual properties.
Specific conditions to be met:

  • pursuant to art. 1136 of the Italian Civil Code, a prior deliberation of the condominium meeting is necessary;
  • the maximum period for the storage of the images is 7 days from the collection (unless there are other proven needs to extend such deadline).
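The 7-day rule above can be sketched as a simple retention check. The dates and the `proven_need` flag are illustrative; in practice, any extension must be justified and documented.

```python
# Minimal sketch of the 7-day retention rule for condominium footage:
# images older than 7 days from collection should be deleted, unless a
# proven need justifies a longer period.
from datetime import datetime, timedelta

RETENTION = timedelta(days=7)

def must_delete(collected_at: datetime, now: datetime,
                proven_need: bool = False) -> bool:
    """True when footage has exceeded the retention period without a proven need."""
    return (now - collected_at) > RETENTION and not proven_need

now = datetime(2021, 3, 10)
print(must_delete(datetime(2021, 3, 1), now))   # True  (9 days old)
print(must_delete(datetime(2021, 3, 5), now))   # False (5 days old)
```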

 ***


PARTICULAR CATEGORIES OF PERSONAL DATA
(hospitals and clinics)

Purposes of processing: protection of the patients’ health, monitoring of particular hospital departments, etc.
If the video surveillance is used to collect particular categories of data (e.g., to monitor the patient's health), it is necessary to:

  • check the existence of a legal basis for the processing under art. 9 of the GDPR (such as, for example, the provision of health care or treatment, ensuring high standards of quality and the safety of health care, etc.).
  • pay special attention so that the collection of personal data is limited to only that data necessary for the purposes of the processing ("minimization");
  • carry out the mandatory DPIA if the processing of personal data concerning patients, disabled persons, mentally ill persons, minors and the elderly is not occasional;
  • constantly monitor the security measures (data storage systems and access to data) applied to the processing.

***


CIRCULATION OF VEHICLES

Purposes of processing: assessment and detection of the violations of the highway code.
Specific conditions to be met:

  • limitation of the location and angle of the video cameras to the areas necessary for the detection of violations;
  • deletion/obscuration of any images collected but not necessary for the purposes of the processing (e.g., images of pedestrians or other road users, passengers present in the vehicle, etc.);
  • performance of the mandatory DPIA in case of processing of personal data on a large scale (e.g., highways) to monitor drivers' behaviour.

***


MUNICIPAL LANDFILLS

Purposes of processing: control and monitoring of hazardous-substance landfills and "eco stations" (checking the type of waste dumped, the time of deposit, etc.).
Limitations:

  • only a public body/entity (not a private person/entity) is allowed to conduct the monitoring;
  • the monitoring is permitted only if alternative tools and controls are not possible or not effective for reaching the same purposes.

***


EDUCATIONAL INSTITUTES

Purposes of processing: protection of the building, of school properties therein, of staff and students, protection from vandalism, etc.
Specific conditions to be met:

  • the video cameras that capture the interior of the institutes can only be activated during closing hours, and therefore not during school or extracurricular activities;
  • if the video cameras capture areas outside the school, their angle must be properly limited.

***


URBAN SAFETY

Purposes of processing: protection of urban safety of the public places or of the areas open to the public.
Specific conditions to be met:

  • storage of the images for a maximum period of 7 days after the collection, unless there are special preservation needs for a longer period (art. 6, para. 8 of Law Decree no. 11/2009).

***


VIDEO SURVEILLANCE FROM HIGH ALTITUDES

Data protection laws do not apply to the processing of personal data that does not allow for the identification of natural persons, either directly or indirectly, such as in the case of video surveillance carried out from high altitudes (for example, using drones or similar devices) or in the case of fake and/or switched-off cameras.


Smart printers and smart objects: friends or foes?

On 9 December 2020, the Italian Antitrust Authority (Autorità Garante della Concorrenza e del Mercato, also known by the acronym "AGCM"), among other things, imposed a fine of 10 million Euros on HP Inc. and HP Italy S.r.l. (hereinafter "HP") for two different commercial practices relating to HP-branded printers which were considered unfair. For the full text of the measure, see the following link: https://www.agcm.it/dotcmsdoc/allegati-news/PS11144_chiusura.

Firstly, the Authority sanctioned the companies in question for not having correctly informed customers of the installation in their printers of software that allowed printing only with HP toners and cartridges, while preventing the use of non-original refills.

The second conduct that the AGCM considered punishable consisted of recording - via firmware present on HP printers and without the knowledge of consumers - data relating to the specific cartridges being used (both original and non-original): this data was used both to build a database useful for formulating commercial strategies and to deny assistance for printers that had used non-original cartridges, thus hindering the exercise of the legal guarantee of conformity.

With reference to the latter conduct, it is interesting to note that this is a case of distorted use of the so-called "Internet of Things", an expression that refers to a "network of physical objects that contain embedded technology to communicate and sense or interact with their internal states or the external environment" (https://www.gartner.com/en/information-technology/glossary/internet-of-things).

Although in this case the technology used by HP was limited to the collection of information relating to the use of printers, it is clear that the significant presence of objects capable of recording and transmitting data on our daily behaviour could have disturbing implications. The concern comes not just from the possibility that data collections may occur without our knowledge, but also and especially from the uses and purposes that motivate companies to use such data.

Of course, the positive implications that a constant flow of information from objects could provide cannot be ignored, for example, when considering the efficiency and improvement of production chains, and of safety systems for citizens (think of "intelligent traffic lights"). However, cases like the one examined by the AGCM lead us to think about the possibility that these technologies may excessively limit consumers' rights.

From the present case it is therefore possible to learn a lesson: before purchasing a "smart" object, it is certainly advisable to gather as much information as possible on the type of sensors and detectors that may be incorporated in such devices, and especially to ascertain how the data acquired by these devices will be used.

Furthermore, it is certainly appropriate to ask within what limits the use of these “smart” devices may support innovation and the improvement of society, as opposed to when – on the other hand – such use can compromise the rights of consumers, understood both as the right to be informed and the basic rights which arise following the purchase of a product (let us think about the limitations on the exercise of the above mentioned legal guarantee).


Caterina Bo joins the editorial board of MediaLaws

We are happy to announce that our Caterina Bo has joined the editorial board of MediaLaws for the year 2020/2021.
The editorial board monitors relevant news in the field of media law and contributes comments and analyses to the blog associated with the specialised journal “Rivista di Diritto dei Media”, founded on the initiative of Professors Oreste Pollicino (Università Commerciale Luigi Bocconi), Giulio Enea Vigevani (Università degli Studi di Milano-Bicocca), Marco Bassini (Università Commerciale Luigi Bocconi) and of the lawyer Carlo Melzi d'Eril.



iPhone: neither water nor AGCM “resistant”

THE CASE

On 30 November 2020, the Italian Competition and Markets Authority (Autorità Garante della Concorrenza e del Mercato, also known by the acronym “AGCM”) imposed a 10 million Euro fine on the companies Apple Distribution International and Apple Italia s.r.l. (hereinafter "Apple") for spreading promotional messages extolling the water resistance of different models of iPhone, whilst failing to specify that this property held true only under certain circumstances that do not correspond to the normal conditions of use experienced by consumers (a so-called "misleading"[1] commercial practice).

In addition, these promotional messages were contradictory due to the presence of the following disclaimer: "The warranty does not cover damage caused by liquids”. In fact, in the after-sales phase Apple denied repairs of iPhones that were damaged as a result of exposure to water or other liquids, thus hindering the exercise of warranty rights protected by the Consumer Code (a so-called "aggressive"[2] commercial practice).

WHAT CONSEQUENCES FOR THE FINED COMPANY?

The present case allows us to focus attention on the consequences befalling the company that is subject to measures adopted by an authority (in this case, the AGCM).

Firstly, such a measure will have repercussions on the very complex relationship of trust that is established between a business and its consumers.

Indeed, it is well known that a trademark suggests to the consumer, in the very moment of purchase, that the product marked by that sign originates from a certain company.

A psychological and emotional relationship is formed between the consumer and the company which - by its very nature - is easily influenced by external circumstances.

It is precisely in this delicate context that the punitive measure of AGCM against Apple acquires relevance since it makes the basis of this relationship - namely trust and reliability - vulnerable.

It should not be forgotten that the basis of this relationship is inherently mnemonic, meaning that it relies on the (positive) memory that the consumer recalls and preserves with respect to a company, its products, and its services. The punitive measure of the authority precisely affects this memory because, among other things, it aims to warn the consumer of future commercial behaviours.

Indeed, it is easy to imagine how nowadays the news of this measure is spreading quickly through social networks, thus reaching a considerable portion of the company’s clients. On this point, one can understand the AGCM’s decision to force Apple to also publish the measure in the section of its website dedicated to the sale of iPhones, under the heading "Information for consumer protection".

This has a greater impact on the company than an economic penalty, insofar as it damages its image and prompts consumers to pay more attention when planning to purchase products from the company affected by the punitive measure.

All of this translates into additional "invisible" costs for the company, i.e. costs that the company will have to bear in the following months in order to rebuild the relationship of trust/reliability with its customers (so-called "reconstructive advertising"). That is without forgetting the corrective activities (and related costs) that, following the warning received from the Authority, the fined company will have to put in place in relation to products already on, or soon due to be introduced to, the market.

From a different point of view, the AGCM’s action also entails a financial penalty.

In the case in question, the penalty that has been imposed, while constituting the maximum amount provided for by current legislation, nevertheless represents less than 0.005% of the total turnover achieved by the Apple group in 2019, which amounted to approximately € 231.57 billion.

Therefore, it is legitimate to wonder if this penalty could have a real deterrent effect.

The answer is surely negative. Attention should therefore be given to the criterion used to determine the amount of the fine.

For example, if the alleged water resistance of the iPhone had been the only reason that brought the consumer to buy an Apple rather than a Samsung product, would a sanction of just 10 million Euros be appropriate?

Evidently not: in that case, the entire cost of an iPhone (about 1,000 Euros) would need to be taken into account, and the penalty would need to be defined as a percentage of the turnover generated by the (unfair) sales that occurred within the territory of the competent authority.
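To make the orders of magnitude explicit, the reasoning above can be sketched as a quick back-of-the-envelope calculation in Python. Note that the unit-sales figure and the 5% rate below are purely hypothetical values chosen for illustration; only the fine and the group turnover come from the figures discussed in this article.

```python
# Illustrative arithmetic only: the turnover-based criterion is this
# article's hypothesis, not current law.

fine_imposed = 10_000_000              # EUR, statutory maximum applied
group_turnover_2019 = 231_570_000_000  # EUR, approximate Apple group turnover

# Under the current predetermined criterion, the fine is a tiny
# fraction of the group's turnover.
share_of_turnover = fine_imposed / group_turnover_2019
print(f"Fine as share of turnover: {share_of_turnover:.5%}")

# Hypothetical alternative: a percentage of the turnover generated by
# the contested sales in the relevant territory.
iphone_price = 1_000                 # EUR, approximate unit cost cited above
units_sold_in_territory = 2_000_000  # hypothetical figure for illustration
penalty_rate = 0.05                  # hypothetical 5% of contested turnover

alternative_fine = iphone_price * units_sold_in_territory * penalty_rate
print(f"Hypothetical alternative fine: {alternative_fine:,.0f} EUR")
```

Even with deliberately modest assumptions, a criterion anchored to the contested sales yields a figure an order of magnitude larger than the statutory maximum actually imposed.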

It follows that a predetermined criterion for the quantification of a penalty - such as that provided by the Consumer Code for unfair practices - is in itself insufficient to assess all the circumstances of the case and consequently to appropriately punish the commercially unsound conduct of a business.

FINAL CONSIDERATIONS

The impression is that companies, especially those with dizzying sales, are underestimating the importance of maintaining greater transparency towards the market, perhaps in the mistaken belief that certain practices will go unnoticed. On the contrary, as we have just seen, such practices have negative consequences on the relationship with their customers. Moreover, in fields where competition is fierce, the risk of the client choosing a competitor is always around the corner.

However, punitive discipline may not be strict enough to have the desired deterrent effects. Of course, in the end it is the consumer who is at a disadvantage by not receiving adequate protection.

It is therefore reasonable to ask whether there may be punitive methods other than financial ones that can more adequately protect the consumer’s interests.

For example, consideration may be given to the possibility of introducing different restrictive measures (with a more practical scope, proportionate to the market share held by the company) against businesses responsible for implementing unfair commercial practices.

[1] Unfair practices are defined as misleading when they represent elements and/or features of a product that do not correspond to the truth.
[2] Unfair practices are defined as aggressive when they consist of harassment, coercion or other forms of undue psychological conditioning of consumers.


November 25 - International Day for the Elimination of Violence against Women

It was November 25, 1960 when the Mirabal sisters lost their lives in Santo Domingo under the Trujillo dictatorship. The memory of that tragic moment was institutionalised only in 1981, when 25 November was recognised as a symbolic date in the fight against violence against women.

Today, exactly 60 years later, violence against women is also perpetrated through technology and IT tools: it suffices to consider the data published today by the Central Directorate of the Criminal Police, according to which, in Lombardy alone, about 718 cases of revenge porn offences were recorded over the last year.

Revenge porn is a recently introduced criminal offence (Law 69/2019 - art. 612-ter of the Criminal Code) which punishes anyone who, having received or otherwise acquired sexually explicit images or videos intended to remain private, disseminates them without the consent of the persons represented therein in order to harm them.

How to address this kind of behaviour?

There is no single answer, but there must certainly be awareness and knowledge of two concepts that are still (unfortunately) underestimated today, namely "privacy" and "confidentiality", as well as of the tools that current legislation provides to protect our personal data from unlawful appropriation.

For example, do we pay enough attention when posting images online that portray us or other individuals? A few days ago, an infographic published by the Italian Data Protection Authority (available at the following link) provided useful suggestions and advice on this matter.

Have we ever checked, before starting a conversation and/or chat with a third party using an IT tool, whether the communication service provider has adopted appropriate security protocols for our conversation? To better understand this matter, you can consult the explanatory page provided by WhatsApp at the following link.

Again, do we know which applications or devices - for example, in our homes or installed in the places we often go to (such as swimming pools, gyms, etc.) - are able to film and/or listen to us without our knowledge? And in the latter case, do we know how to exercise our privacy rights?

After all, the digital world also opens up new horizons with regard to the protection of women against violence and in this context the protection of our personal data plays a very important role.

Eliminating violence in all of its forms is the common goal; awareness and knowledge are crucial to this end.


Facial Recognition and Digital World: technology at the service of convenience?

How many of us unlock our smartphones, make an online payment, authorize the download of an app and/or access a web portal simply by bringing the mobile device closer to our face? How easily do we "tag" our friends in our pictures on the most well-known social networks? And again: how many and what advantages may be obtained from knowing the number of passers-by who stop, even if just for a moment, to look at a billboard?

Statistics show that facial recognition technology is at the service of a digital world that "runs" faster and faster and which forces us to keep up with the times. But at what price for the protection of our personal data?

1. Introduction

Social networks, e-commerce websites, online magazines, home banking and mobile apps: there are millions of digital services available online that we can use through the creation of personal accounts.

When creating profiles, the most widespread habit, especially among young people, is to rely on easy and intuitive passwords (such as a date of birth or first name) which are not very secure from an IT point of view and are often identical across all the services those people use[1].

In order to deal with these bad habits - which only feed the already high number of data breaches - it has now become common to use so-called "facial recognition" technology (in Italian, "riconoscimento facciale"): an IT process that associates the features of a person's face with a digital image and stores that image in an electronic device for the purpose of reusing it, not only as a means of identification, but also for the authentication, verification and/or profiling of individuals.
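To make the notions of identification and authentication described above more concrete, the following minimal Python sketch (entirely hypothetical, not any vendor's actual system) treats a face, once processed, as a numerical "template" and compares templates using cosine similarity. Real systems derive hundreds of values from a face image; the four-number vectors here are toy data.

```python
import math

def cosine_similarity(a, b):
    """Compare two feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(stored_template, captured_template, threshold=0.9):
    """Authentication: does the captured face match the enrolled one?"""
    return cosine_similarity(stored_template, captured_template) >= threshold

# Toy templates: one stored at enrolment, one captured at unlock time.
enrolled = [0.12, 0.80, 0.35, 0.41]
capture = [0.13, 0.79, 0.36, 0.40]
print(is_same_person(enrolled, capture))  # a close match passes the check
```

The point of the sketch is legal as much as technical: what the device retains and compares is not a photograph but a derived numerical representation, and it is this representation that qualifies as biometric data once it allows unique identification.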

But is it really always safe to rely on facial recognition? Does a biometric system always guarantee sufficient protection of our personal data?

2. The most frequent uses of facial recognition technology

It’s well known that different biometric techniques lend themselves to being used mainly in the IT context (for example, for authenticating access to a device) and the trend of the main high-tech companies is to invest ever greater amounts of money in this field.

However, facial recognition is also used outside the digital world: take for example the use of biometric systems for the control of physical access to reserved areas, for the opening of gates or for the use of dangerous devices and machinery.

But that's not all. Facial recognition techniques are also capable of serving public authorities and even research. The police in New Delhi have in fact tested facial recognition to identify almost 3,000 missing children, while some researchers have used it to detect a rare genetic disease found in subjects from Africa, Asia and Latin America[2].

Faced with such a wide range of uses of facial recognition, it is worrying that specific national legislation on the matter has not yet been enacted in our country. Indeed, agreeing to the detection and collection of our facial features by a data controller means sharing with the latter a wide range of personal data and exposing ourselves to whatever processing the controller decides to carry out on such data.

Think of a simple "selfie" taken with our smartphone: in such cases our device collects our personal image and stores it in its memory. Or think of walking past billboards that detect our presence, of the measurement of our body temperature using video and digital thermometers, or of the video-recognition boarding systems installed in the world's largest airports.

3. A quick vademecum for the processing of biometric data

The biometric characteristics of a face that allow for the unique identification of a natural person fall within the notion of "biometric data" provided by European Regulation no. 679/2016 ("GDPR")[3]. In fact, biometric data is defined by the GDPR as data "resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person"[4]. This means that an image or a photograph does not always qualify as biometric data: it does so only if it is processed through specific technical means that allow for the unique identification or authentication of a natural person[5].

Biometric data also fall within the category of "special categories of personal data" pursuant to art. 9 of GDPR (referred to by art. 2-septies of Legislative Decree no. 196/2003 - "Privacy Code") and can be processed only when the data controller complies with certain legal obligations. Let's try to list some of these obligations here below:

A. Compliance with the fundamental principles of the processing. In an increasingly digital world, the principles of "privacy by design" (data protection by design) and "privacy by default" (data protection by default) provided for by art. 25 GDPR play a leading role[6]. In order to comply with these principles, data controllers who use facial recognition must, starting from the design and definition phases of the processing tools, provide adequate security measures to ensure the protection of the fundamental rights and freedoms of individuals, as well as compliance with the principles set out in art. 5 of GDPR.

Specifically, attention should be paid to the principle of "data minimization", which requires the data controller to configure a biometric recognition system so as to collect and process only a limited amount of information, excluding the acquisition of additional data that is not necessary for the purpose to be achieved in the specific case (for example, if the purpose of the processing is computer authentication, biometric data should not be processed in such a way as to infer information of a sensitive nature about the data subject, such as clearly visible skin diseases).
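By way of a purely illustrative Python sketch of what data minimization can mean in practice (the function names and the trivial "feature extraction" below are our own inventions, not any real biometric library), a minimizing controller retains only the derived template needed for authentication and discards the raw image, together with everything incidental that it contains:

```python
def extract_template(raw_face_image):
    # Stand-in for a real feature extractor: here we simply reduce the
    # image bytes to a short, fixed-length list of numbers.
    return [sum(raw_face_image[i::4]) % 256 for i in range(4)]

def enroll_user(user_id, raw_face_image):
    """Store only what the authentication purpose requires."""
    template = extract_template(raw_face_image)
    # The raw photograph is deliberately NOT kept in the record: it is
    # discarded once the template has been derived, so no incidental
    # sensitive detail visible in the image is retained.
    return {"user_id": user_id, "template": template}

record = enroll_user("alice", bytes(range(32)))
print(sorted(record))  # only the identifier and the template survive
```

The design choice the sketch illustrates is the one art. 25 GDPR requires to be made at the design stage: the decision not to store the raw image is built into the enrolment function itself, rather than left to a later clean-up.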

B. Information notice. The data controller must provide the data subjects with a privacy notice in accordance with art. 13 of GDPR which, in a clear and transparent manner, indicates the purposes of the processing, the security measures adopted, the possible centralization of the biometric data collected, and the storage periods of the personal data. In this regard, it is worth pointing out that, as clarified by the Italian Data Protection Authority[7], such privacy notice has to be delivered before the so-called "enrolment" phase, which takes place before the creation of a biometric sample[8].

C. Legal basis of the processing. The data controller must ask for the prior consent of the data subjects in order to process their biometric data, or alternatively the data controller should assess the possibility of relying on another legal basis under Article 9 of the GDPR (including, for example, the existence of reasons of public interest in the area of public health, such as the protection against serious cross-border threats to health).

D. DPIA. As provided for by art. 35 of the GDPR and Annex 1 to Provision no. 467/2018 of the Italian Data Protection Authority, the data controller must assess the impact of the processing of biometric data, specifically evaluating the risks that such processing may entail for the rights and freedoms of individuals and, at the same time, identifying the security measures adopted, and to be adopted, to address these risks.

E. Appointment of the data processor. Where the data controller engages a third party for the processing of biometric data, the latter must be appointed as "data processor" pursuant to art. 28 of GDPR, following verification that the third party possesses suitable guarantees for the protection of the rights of the data subjects whose biometric data is processed.

F. The implementation of alternative systems. The data controller must offer alternative solutions that do not involve the processing of biometric data, without imposing restrictions or additional costs on the data subject. Such alternative solutions are necessary especially for those who are unable to comply with the constraints imposed by a biometric system (think of a disabled person who cannot reach, with his or her face, the height of a thermoscanner) and in cases where the device is unavailable due to technical problems (for example, a malfunction).

4. Conclusions

The applicable data protection regulations are not and should never be considered as an obstacle to the development of new technologies applied to the IT and digital industry. On the contrary, compliance with existing legislation should be an incentive for creating practical solutions in a way that respects the confidentiality of our information.

This should also be the case for facial recognition technology, in relation to which it is important to make users aware of the security of the processing of their personal data, not least because generating awareness means gaining consumers’ trust, which is the first step of a sound marketing strategy.

Apple has done just that with the recent "iOS 14" update, which allows the owners of the latest mobile devices to know, through colored indicators that appear on the status bar of the device (green for the camera, orange for the microphone), whether an installed app is accessing the camera and thus capturing the user's image.

On the other hand, the protection of our personal data must never be sacrificed. To that end, in our opinion, it is essential that our country enact regulations governing this technology. The added value that facial recognition can bring to our economy has been plain for all to see for some time, but if we do not act at the regulatory level in the short term, we risk facing, within a few years, the uncontrolled development and use of these technical solutions, and having to spend time and economic resources solving multiple problems rather than generating new advantages.

 

[1] This is confirmed by an interesting (and, for all of us, worrying) study published during the “Safer Internet Day”, according to which more than half of Italian millennials (55%) use the same password to access different services and 19% use extremely simple passwords such as a numerical sequence.

[2] Also noteworthy is the new “TELEFI” project (“Towards the European Level Exchange of Facial Images”), funded by the European Commission. It is a study of the benefits that facial recognition can bring to crime investigation in EU Member States, and of the exchange of the collected data within the "Prüm" system, through which DNA, fingerprints and vehicle registration data are exchanged between EU countries to combat cross-border crime, terrorism and illegal migration.

[3] Classic examples of biometric data, in addition to the characteristics of the face, are: fingerprints, handwritten signature dynamics, the retinal vein pattern, the shape of the iris and the characteristics of the voice.

[4] See, for more details, Opinion no. 2/2012 of the Article 29 Working Party (now replaced by the “European Data Protection Board”) - https://www.pdpjournals.com/docs/87997.pdf.

[5] See Recital no. 51 GDPR.

[6] See “Guidelines no. 4/2019 on Article 25 Data Protection by Design and by Default” - Version 2.0 Adopted on 20 October 2020.

[7] See on this matter “Guidelines on biometric recognition and graphometric signature” issued by the Italian data protection Authority on 12 November 2014.

[8] The term "enrolment" refers to the process through which a subject is registered with the biometric system by acquiring one of his or her biometric characteristics. Indeed, to enable biometric recognition it is necessary to acquire the biometric characteristic through a procedure ensuring that enrolment is performed appropriately, that the link with the captured subject is retained, and that the quality of the resulting biometric sample is safeguarded. Generally, the facial biometric sample is used to extract, via algorithms sometimes based on so-called “neural networks”, a given set of features, such as the location of the eyes, nose, nostrils, chin and ears, in order to build a biometric template.