The “Magic Avatar” and the world of artificial intelligence: lights and shadows of a trend that “revolutionizes” privacy and copyright

On December 7, 2022, “Lensa” became the most popular iPhone app on the Apple App Store. The reason? Although “Lensa” has been on the market since 2018, last November it launched a new feature called “Magic Avatar”: taking advantage of artificial intelligence, this feature allows users – upon payment of a fee – to transform their selfies into virtual avatars.

At first glance, one does not notice any problem arising from an avatar that shows the (clearly enhanced) face of the subject of the selfie; upon closer analysis, however, several legal issues are connected to the use of this Lensa feature.

Indeed, the application works thanks to artificial intelligence and on the basis of huge amounts of data (so-called “datasets”) which are stored and used to improve the performance of the application itself. In most cases, these datasets are nothing more than images collected at random from the web, over which there is obviously no real control as to the existence of any rights. This is the first problem: the collection and dissemination of illustrations without the consent of the artists who created them amounts to copyright infringement. Authors receive no recognition or remuneration for their works – which, instead, should be guaranteed to them pursuant to the Italian Copyright Law (Law no. 633/1941, as amended) – and they also find themselves competing with artificial systems able to “emulate” their style in a few minutes.

The problem does not concern the avatar generated by the “Lensa” application as such, but the huge number of images extracted from the web, which the system uses to train itself and from which it must “learn” in order to reproduce the avatar. The consequences of such a trend should not be underestimated, since it is fair to wonder whether artificial intelligence might one day completely replace human creative activity. Such an undesirable scenario is not so unlikely if we consider that the treatment of visual works created through artificial intelligence systems is currently being studied by the US Copyright Office.

In order to (partially) address this issue, the website “Have I Been Trained” was created to help content creators find out whether the datasets used by artificial intelligence systems unlawfully reproduce their creations.

There is also a second and more worrying aspect, concerning the processing of personal data by “Lensa”. Upon payment of a very low fee to generate the avatar, people provide the application with personal data and information that may also be used for purposes completely different from the creation of “filtered images” and that therefore have a significant economic value. This is one of the main complaints made against the app: once installed, “Lensa” collects more data than is necessary for its operation, transferring it to servers located in the USA (where the company’s registered office is located). That alone is enough to argue that the data processing does not comply with the GDPR.

Indeed, the Lensa app’s privacy policy states that users’ biometric data (defined under Art. 4(14) of the GDPR as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”) will be deleted from the servers once the app has used them to generate the Magic Avatar.

The point is that – as often happens – Lensa’s privacy policy is long and complex, adopting legal terms that are difficult for users to understand; for example, we read that “Lensa” does not use “facial data” for reasons other than the application of filters, unless the user gives consent to use the photos and videos for a different purpose. This might seem comforting but, on a deeper analysis of the terms and conditions, it turns out that “Lensa” reserves far broader powers – of distribution, use, reproduction and creation – over the works derived from user content, subject to an additional “explicit consent” where required by the applicable law (i.e., the various national laws).

But where does such “explicit consent” come from? Easy: by sharing the avatar publicly or tagging “Lensa” on social networks, even via hashtag, the user gives consent to the use of that content and thus authorizes the company to reproduce, distribute and modify it. This licence – which ends with the deletion of the account – is justified in Lensa’s privacy policy on the basis of the company’s so-called “legitimate interest” (i.e., “it is our legitimate interest to make analytics of our audience as it helps us understand our business metrics and improve our products”).

However, this statement raises some concerns, especially in the light of the decision issued by the Italian Data Protection Authority concerning the social network “Clubhouse”, according to which the company’s “legitimate interest” is not a proper legal basis for processing such data, neither for carrying out data analysis nor for the “training” of the systems.

In the end, artificial intelligence undoubtedly represents an epoch-making technological evolution, but its use may entail the risk of an unlawful restriction of users’ rights; indeed, a European Regulation on artificial intelligence, aimed at defining the scope and conditions of its use, has been under consideration for some time.

In this respect, it is to be hoped that the “Lensa” application will take steps as soon as possible to protect illustrators’ rights by granting them proper remuneration, and to ensure that user data is collected and processed correctly, in accordance with the applicable privacy laws.


Drones and Privacy: an imperfect match

While until a few years ago the use of drones was the prerogative of video makers and the military sector, nowadays, thanks to technological evolution and lower costs, these small, compact aircraft increasingly serve as recreational devices with which evocative landscape shots can be taken.

Precisely because of their ability to show the world as it has never been seen before, from an original and unusual perspective, the unmistakable buzzing sound perceptible dozens of meters away is also beginning to be heard in cities, beaches, or simply at organized events.

Any drone includes at least a GPS module and a camera, although the configuration can become more elaborate if required; in fact, more advanced drones also include night-vision cameras, 3D scanners, thermal imaging cameras, WiFi and Bluetooth devices, and so on.

So, one may wonder: to what extent is the use of such devices lawful? The answer is not obvious, especially considering that a drone is equipped not only with a camera, but also with internal memory capable of collecting and storing data and information about individuals in the area flown over.

The nature of such machines and the advanced technologies with which they are equipped make them inherently suitable tools for capturing “sensitive” data. It is clear that careless use of a drone, even if only for recreational purposes, could come into conflict with the privacy and confidentiality of the persons filmed.

To answer this question, we must first look at European Regulation 2016/679, also known by its acronym GDPR.

Not all drone pilots are privacy experts, so those who decide to fly one, even in a moment of recreational entertainment and fun among friends, may be unaware that they have to follow certain rules and good practices to avoid violating privacy regulations – and not only that. It is good to keep in mind that careless use of a drone could have civil as well as criminal implications.

Therefore, two different issues come into play in this regard: that of the privacy of the people filmed (in terms of data acquisition) and that of the protection of personal data (in terms of subsequent use).

First of all, as is well known, the right to privacy of third parties cannot be violated by filming private residences or places closed to the public from above. Such an infringement will entitle the injured party to compel the video maker to destroy the collected images and to prevent him from taking further videos, without prejudice to the right to take legal action to obtain compensation for any damages suffered (Art. 10 of the Italian Civil Code).

The issue becomes more delicate when we not only make videos, but also decide to disclose the footage and images now in our possession by posting them, for example, on our social networks or on the Internet. In such cases, it is essential to take all the measures imposed by the GDPR in order to minimize the risk of incurring the heavy penalties that the Italian Data Protection Authority might impose.

First of all, the Italian Data Protection Authority emphasizes that when flying a drone equipped with a camera in a public place such as parks, streets or beaches, one should avoid invading people's personal spaces and intimacy or, in any case, avoid capturing images containing personal data such as license plates of cars or residential addresses.

Not only that. If the decision to disclose the footage is made, it is essential, as a first step, to collect the consent of the subjects involved to the publication of the images, which is the legal basis that makes their distribution lawful (Art. 6 GDPR). Such consent is not required only if, due to the distance of the filming, the faces of the subjects are not recognizable or are, in any case, obscured.

The GDPR also considers lawful the filming necessary for the performance of a contract concluded with a person who purchases a product that the seller delivers to his or her home by drone.

Pilots, moreover, should always observe the principles on data processing set forth in Article 5 of the GDPR, which requires that data be adequate, relevant and not excessive with regard to the purposes for which they were captured. The Droner, therefore, in compliance with the aforementioned principles, should favor proportionate technology and prefer anonymization techniques that, through blurring or other graphical effects, automatically obscure images in order to avoid the identification of individuals where it is not necessary.
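The blurring mentioned above can be illustrated with a minimal, dependency-free sketch. In practice, drone software would run a face or license-plate detector (for example, with a library such as OpenCV) on each video frame and then obscure the detected regions; in this simplified model the frame is a 2D list of grayscale values and the region coordinates are assumed to come from such a detector (all names here are illustrative, not taken from any real product):

```python
def pixelate_region(frame, x, y, w, h, block=4):
    """Crudely anonymize a rectangular region of a frame by replacing each
    block of pixels with its average value, so that a face or license plate
    in that region is no longer recognizable."""
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            ys = range(by, min(by + block, y + h))
            xs = range(bx, min(bx + block, x + w))
            vals = [frame[j][i] for j in ys for i in xs]
            avg = sum(vals) // len(vals)
            for j in ys:
                for i in xs:
                    frame[j][i] = avg
    return frame

# A tiny 4x4 "frame": the top-left 2x2 region (a hypothetical detected face)
# is averaged away, while the rest of the frame is left untouched.
frame = [[10, 20, 30, 40],
         [50, 60, 70, 80],
         [90, 100, 110, 120],
         [130, 140, 150, 160]]
pixelate_region(frame, 0, 0, 2, 2, block=2)
```

Applying such a step automatically to every frame, before storage or publication, is one concrete way of honoring the minimization principle the Authority refers to.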

For more shrewd pilots, it should be stressed that it is extremely risky to justify the collection of sensitive data by invoking the non-applicability of the GDPR to the processing of personal data carried out “by a natural person in the course of a purely personal or household activity” (Art. 2(2)(c) GDPR). Indeed, the judges of the Court of Justice of the European Union interpret this rule narrowly and, as a general rule, this provision does not constitute an exemption from the GDPR (Judgment of the Fourth Chamber, 11 December 2014, František Ryneš v. Úřad pro ochranu osobních údajů, C-212/13).

Finally, the criminal implications must not be forgotten for the pilot who decides to make more brazen use of the drone by procuring footage, visual or sound, containing data pertaining to the private life of people inside private residences. The risk is imprisonment from six months to four years (Articles 614 and 615-bis of the Italian Criminal Code).

Even in this case, the law is more severe when the filming is unlawful, meaning taken without the consent of the person filmed. Thus, the importance of obtaining consent emerges once again: for the Droner, it could be the only way to avoid an otherwise certain conviction, together with objective reasons of a higher order that justify such filming (e.g., public order requirements).

In conclusion, it can be highlighted that privacy protection must be carefully evaluated because of the enormous technological potential of drones and the underlying economic interests. It is easy to speculate that the increasing use of drones in activities with high social impact will make the issue of protecting people's privacy increasingly prominent. Common sense and precaution, after all, remain the best principles in the responsible use of new technologies. It would be sufficient to refer to them to resolve many doubts and disputes.


Free transfer of personal data to the Republic of Korea: a historic adequacy decision is coming

The European Data Protection Board (“EDPB”) has issued its opinion on the draft adequacy decision published by the European Commission on 16 June 2021 (available here) concerning the transfer of personal data to the Republic of Korea.

This is a decision that, once in force, will allow EU economic operators – such as, first of all, electronic communication service providers, cloud providers and multinational companies – to freely transfer personal data from Europe to the Republic of Korea without having to adopt either the appropriate safeguards (e.g., “Standard Contractual Clauses”) or the additional measures (e.g., consent of data subjects) required by Chapter V of EU Regulation no. 679/2016 (the “GDPR”).

Indeed, pursuant to articles 44 et seq. of the GDPR, the transfers of personal data to countries outside the European Economic Area or to an international organization are allowed provided that the adequacy of the third country or organization is expressly recognized by a decision of the Commission.

We will now examine in detail the contents of the opinion issued by the EDPB.

Firstly, it was noted that the Republic of Korea's legal framework on the protection of personal data is substantially aligned with the European one, especially with regard to the main definitions provided for by law (“personal data”, “processing” and “data subject”), the requirements for a lawful data processing, the general principles and the security measures.

This has been possible not only thanks to the presence of an effective privacy law (the “Personal Information Protection Act” or “PIPA”, which came into force in 2011) but also because of a series of “notifications” (including “Notification no. 2021-1”) issued by the Korean Data Protection Authority (the “Personal Information Protection Commission” or “PIPC”), which explain and make easily understandable the provisions of PIPA.

Moreover, as noted by the EDPB, the Republic of Korea is a party to a number of international agreements guaranteeing the right to privacy (including the “International Covenant on Civil and Political Rights”, the “Convention on the Rights of Persons with Disabilities” and the “UN Convention on the Rights of the Child”), which confirms the attention that the Republic of Korea has paid to the protection of personal data for several years now.

The EDPB's analysis then focused on some key aspects of PIPA that slightly differ from the GDPR and therefore require more attention - such as, in particular, the absence of a general right to withdraw the consent provided by the data subjects, for example, for marketing activities.

According to the EDPB, although article 37 of PIPA grants data subjects the right to request the “suspension” of the processing of their personal data - a right that can be exercised also in case of direct marketing, as expressly clarified by Recital 79 of the EU Commission adequacy decision – the PIPA provides for the right to withdraw the consent only in two specific cases:

  1. in relation to the transfers of personal data carried out in the context of special corporate operations (such as mergers, acquisitions, etc.);
  2. with regard to the processing of personal data for marketing activities by providers of electronic communication services.

The EDPB therefore considered it necessary to draw the Commission's attention to the above-mentioned issues in order to analyze in detail the consequences that, in the light of the Korean legal framework, the absence of such a right might cause for data subjects and to clarify, in the adequacy decision, the actual scope of the above-mentioned right to request the “suspension” of the processing.

Secondly, the EDPB observed that, pursuant to article 58 of PIPA, a substantial part of PIPA – including Chapters III, IV and V, which respectively regulate the general principles for data processing, the security measures and the rights of data subjects – does not apply to several types of processing of personal data (including those necessary to meet urgent needs for the protection of public health and safety).

The EDPB also notes that the word “urgent” in the PIPA expresses an extremely broad concept that needs to be limited and contextualized, also with the help of practical examples, in order not to compromise the confidentiality of the data subjects’ personal data.

Moreover, in the light of the current emergency caused by the Covid-19 pandemic, the EDPB drew the Commission's attention to the need to ensure an adequate level of protection also for personal data transferred to the Republic of Korea for purposes related to the protection of public health.

This is because "sensitive" information relating to European citizens (for example, the vaccination status), should receive at least the same level of protection as granted under the GDPR once transferred to the Republic of Korea. In this regard, the EDPB therefore invited the Commission to closely monitor the application of the exemptions provided for in article 58 of PIPA.

Finally, the EDPB considered it appropriate to focus on the possibility for Korean public authorities to access the personal data of European citizens for national security purposes. In this respect, there is no specific obligation for Korean authorities to inform data subjects of the access to their personal data, especially when data subjects are not Korean citizens.

However, even in the absence of such obligation, the balance between the needs of protection of the national security and the protection of the fundamental rights of the data subjects can be found in the same Korean Law that protects the privacy of interpersonal communications (the "Communications Privacy Protection Act" - see also Recital 187 of the adequacy decision), according to which the access to the personal data of European citizens for purposes of national security can be made only if certain legal requirements are met (for example, in the case of communications between "foreign agencies, groups or citizens suspected of being involved in activities threatening national security").

The EDPB notes that, as a further guarantee of the confidentiality of communications accessed by the Korean authorities, the South Korean Constitution states essential data protection principles applicable to this specific matter.

In the light of the favorable opinion issued by the EDPB, it is certainly desirable, and likely, that the European Commission will adopt an adequacy decision in respect of the Republic of Korea.

In an increasingly data-driven global economy based on the economic value of personal data as well as on the sharing of personal data, such an adequacy decision would open the door to the liberalization of trade with the East, also from a privacy perspective.

This long-awaited regulatory intervention certainly follows on from the “Free Trade Agreement” between the EU and South Korea, in force since 2011, which has exponentially increased bilateral trade between the two parties (in 2015 the value of trade amounted to around €90 billion).

Our hope is that, as the years go by, the European Commission's adequacy assessments will cover more and more legal frameworks so that the international transfer of personal data can represent a real and concrete instrument for promoting the economy and innovation worldwide.


The Italian Data Protection Authority and the 2021 inspection activity: biometric data, video surveillance, food delivery and data breaches will be in the spotlight between January and June

The Italian Data Protection Authority (DPA) has defined the boundaries of the inspection activity planned for the first six months of 2021. These will include 50 inspections, to be conducted also by the Italian Finance Police (under delegation by the DPA), and will focus on verifying compliance with the applicable privacy laws in the following matters of general interest:

  1. processing of biometric data for facial recognition also through video surveillance systems;
  2. processing of personal data in the context of the so-called "domestic video surveillance" sector and in the sector of audio/video systems applied to games (so-called connected toys);
  3. processing of personal data carried out by "data brokers";
  4. processing of personal data carried out by companies operating in the "Food Delivery" sector;
  5. data breach.

From this list, two major developments emerge: this year the Italian DPA will extend its inspections to the processing of biometric data, as well as to processing carried out through video surveillance systems. These are two areas governed not only by the GDPR and the Privacy Code but also by various guidelines and other legal provisions, as well as by extensive case law.

Let us mention, just for example, the Guidelines of the Italian DPA on biometric recognition and graphometric signature of 2014, the renewed Article 4 of Law no. 300/1970 and Administrative Memo no. 5/2018 issued by the National Labour Inspectorate, the decision of the Italian DPA on video surveillance of 2010 and the recent FAQ on video surveillance of 5 December 2020, the national and EU case law concerning the monitoring of workers and the so-called "defensive controls", Opinion no. 2/2017 of the former Working Party art. 29 ("Opinion 2/2017 on data processing at work") as well as Guidelines no. 3/2019 of the European Data Protection Board (EDPB) on processing of personal data through video devices.

The above considerations highlight the delicate and complex task of identifying the privacy requirements to be met by data controllers and processors, i.e. the economic operators. Indeed, especially before embarking on an activity involving the processing of biometric data or the use of video surveillance systems, it is necessary to clarify the particular circumstances of the case at issue (identifying the purposes of the processing, the security measures to be adopted, the possible involvement of third-party providers, etc.) in order to correctly prepare the privacy documents required by the many applicable regulations (possibly with the help of specialized professionals).

Therefore, it will be interesting to analyse the results of the inspection activity of the Italian DPA in order to understand – three years after the GDPR became applicable – the level of compliance that the Authority will consider “acceptable” and the real level of compliance reached by companies operating in our country that process special categories of personal data and use video surveillance systems.

Of course, the privacy obligations relating to the processing of biometric data or to processing through video surveillance systems are in addition to those generally required for the processing of personal data; consequently, in order to achieve full compliance with the privacy regulations in force, it is necessary not only to regulate particular areas of business activity (such as, for example, video surveillance or biometrics) but also to adopt (or rather, to have already adopted) a solid internal privacy structure which – in case of inspections – can prove to the authorities that the processing of personal data carried out fully complies with the relevant legal provisions.

With particular reference to video surveillance, we would like to remind you that our Firm has developed and published on its website the quick and useful Guidelines for the installation of video surveillance systems, updated with the latest Italian and European regulations. You can consult the Guidelines here.


Contact tracing and COVID-19: the GDPR as a balance between the protection of health and the right to privacy

The use of contact tracing technology is necessary and essential in order to deal with the Covid-19 emergency and protect the public health of our country. However, mapping the movement of individuals can have serious consequences for the protection of our privacy. So how can the right balance be found between the two fundamental rights of health and privacy of each individual?

“Contact tracing” is the expression of the moment. It is a digital system used for tracing physical contact between individuals and in that sense it represents an important technological measure aimed at containing and preventing the spread of the Covid-19 virus in our country (and elsewhere).

This tracking system should be implemented via the application called “Immuni”, designed and developed by the Milan-based software house Bending Spoons, which will (probably) be launched in Italy by the end of May 2020.

However, the tracking of contacts between individuals and the consequent use of their common and sensitive personal data (including health data) for purposes related to the protection of public health also has an impact on their privacy.

While, on the one hand, the protection of health is a right guaranteed by the Italian Constitution – wherein “health” is understood both as a fundamental right of the individuals and an interest of the community, pursuant to article 32 of the Italian Constitution - on the other hand, the protection of personal data (or the privacy right) is a fundamental right expressly provided for by the Charter of Nice and recognized by the Italian Constitution.

Furthermore, we should ask ourselves what could be the practical consequences of using contact tracing technology in our daily lives; how the relationship between the right to health and the right to protection of personal data has been dealt with at a regulatory level; and whether it is possible to rely on the use of this new technology without fear of violation of our right to privacy.

1. What is "Immuni" and how does it work?

Immuni is an application that can be downloaded on each mobile device and that generates - for each device - a temporary, anonymous and variable identification code (ID) which interacts, via “Bluetooth Low Energy” technology, with other nearby mobile devices, thereby collecting and storing the ID code and related metadata of those devices (for example, how long the connection with the other device lasted, the distance in meters, etc.). The data controller of the personal data collected by Immuni is the Italian Ministry of Health.

Immuni checks whether the nearby ID codes include so-called “positive IDs”, i.e. ID codes associated with mobile devices owned by people who are already infected (or rather, whose infection has already been ascertained by a healthcare facility); to perform this operation, Immuni downloads the positive IDs from a publicly managed server at regular intervals and cross-checks them against the IDs collected and stored by the device on which the application is installed. Subsequently, Immuni processes the collected metadata through a special algorithm and determines whether a “potential risk of Covid-19 contagion” (which may be higher or lower) can be established.

If the outcome of the above checks is positive and Immuni determines that there is a reasonable risk of contagion, the user receives a notice on his/her device indicating that he/she was in contact with an infected person, inviting him/her to follow certain instructions (including, for example, staying at home and/or carrying out diagnostic tests).

A practical example on how this works: Carlo and Giulia meet for a few minutes at a short distance from each other; they both downloaded Immuni and their mobile devices have captured each other’s ID codes. After a few days, Carlo discovers that he has Coronavirus and decides spontaneously (remember that there is no obligation in this regard) to upload this sensitive data to Immuni. Meanwhile, the app installed on Giulia’s smartphone examines the IDs she has collected and stored in its memory and cross-checks them with those downloaded from the public server, detecting the presence of Carlo’s ID. Subsequently, Giulia is informed through a notice sent by her Immuni app that she was in contact with a person that tested positive to the Coronavirus (without, however, indicating who that person is) and she is thereby invited to take certain cautionary measures.
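As a rough illustration of the matching step in this example, the sketch below derives rotating pseudonymous IDs from a device secret and cross-checks the IDs one device has “heard” over Bluetooth against those published for infected users. It is a deliberately simplified model, with hypothetical function names; the real Immuni app relies on the Apple/Google Exposure Notification framework, whose cryptography is considerably more elaborate:

```python
import hashlib
import secrets

def rotating_id(device_secret: bytes, interval: int) -> str:
    """Derive a temporary, pseudonymous broadcast ID from a device secret
    and a time interval (illustrative scheme, not the real protocol)."""
    digest = hashlib.sha256(device_secret + interval.to_bytes(4, "big"))
    return digest.hexdigest()[:16]

def exposure_detected(heard_ids, positive_ids) -> bool:
    """Cross-check the IDs heard over Bluetooth against the 'positive'
    IDs periodically downloaded from the public server."""
    return not set(heard_ids).isdisjoint(positive_ids)

# Carlo's device broadcasts rotating IDs; Giulia's device stores the ones it hears.
carlo_secret = secrets.token_bytes(16)
heard_by_giulia = [rotating_id(carlo_secret, i) for i in range(3)]

# After Carlo voluntarily uploads his data, the server publishes his recent IDs,
# and Giulia's app finds a match without ever learning who Carlo is.
published_positive = {rotating_id(carlo_secret, i) for i in range(3)}
assert exposure_detected(heard_by_giulia, published_positive)
```

Note how the match is computed locally on Giulia's device against pseudonymous codes only, which is what allows the notification to be sent without revealing the infected person's identity.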

2. The opinions of the European Data Protection Board and the Italian Data Protection Authority

The European Data Protection Board (see the “Guidelines 04/2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak”) and the Italian Data Protection Authority (see “Parere sulla proposta normativa per la previsione di una applicazione volta al tracciamento dei contagi da COVID-19”) have already issued their opinions on the use of geo-localisation of individuals and contact tracing tools during the Covid-19 emergency and have identified which measures should be taken in order to ensure that the data subjects’ personal data is processed without causing prejudice to their fundamental rights and freedoms.

In this regard, according to the Italian Data Protection Authority the contact tracing carried out through the Immuni app is in line with the criteria identified by the European Data Protection Board and is compliant with the data protection principles inasmuch as such contact tracing:

a) is regulated by a law that provides a sufficiently detailed description of the processing of personal data, the type of data collected, the guarantees given to the data subjects, the provisional duration of the measure (reference should be made to art. 6 of Italian Legislative Decree of 30 April 2020, no. 28);

b) is based on the voluntary participation of the data subject, excluding any form of conditioning of individual choice and, therefore, any possible unequal treatment of those who decide not to consent to the tracking;

c) is designed to pursue a public interest purpose indicated with sufficient precision and excludes that the personal data collected is being processed for other different purposes, it being understood that there is the possibility (within the general terms provided for by the GDPR) of using the personal data, either anonymously or in aggregate form, for statistical or scientific research purposes;

d) appears to comply with the principles of minimisation as well as with the criteria of privacy by design and by default (set out in art. 25 of the GDPR), insofar as it provides for the collection of just the data that concerns the proximity or closeness of the devices and for their treatment in pseudonymous form, provided such collection may not occur in a completely anonymous form. Such collection must occur in such a way as to exclude the use of geo-localisation data and limit the storage of data to the time strictly necessary to reach the indicated purpose, with the automatic deletion at the expiry date.

In this regard, it should be noted that, pursuant to art. 6 of Italian Legislative Decree no. 28/2020, the use of Immuni and any related processing of personal data will have to cease at the end of the state of emergency and in any case by 31 December 2020, and all personal data processed must be permanently deleted or anonymized;

e) complies with the principle of transparency with respect to the data subjects, thus guaranteeing that before activating the app the users receive an information notice in accordance with the GDPR.

Consequently, the Italian Data Protection Authority supports the use of Immuni while maintaining special attention towards the data subjects: indeed, in its opinion, the Authority clarifies that, on the one hand, the characteristics of the processing of personal data carried out by Immuni can be better identified and that, on the other hand, adequate measures to protect the rights, freedoms and legitimate interests of the data subjects can be adopted (in accordance with art. 2-quinquiesdecies of the Privacy Code and art. 36, para. 5, of the GDPR).

3. GDPR as a balance between the protection of health and personal data

There is unfortunately a widely held opinion, especially among non-professionals, that privacy regulations are a mere “bureaucratic complication” acting as an obstacle to the achievement of all those goals which involve the processing of personal data.

This is an erroneous and misleading opinion, often generated by a lack of knowledge of the law, and it may also be dangerous in light of the consequences it could lead to - consider, for example, the remote possibility of completely forgoing a valid and efficient contact tracing system for dealing with the Covid-19 health emergency on the grounds that such a system is allegedly “incompatible” with the protection of personal data.

This writer believes that the current scenario represents the perfect context to demonstrate that, on the contrary, privacy regulations may (and must) represent the “balance” which allows for the achievement of some of the most ambitious purposes – which certainly include the protection of public health - without giving up privacy.

First of all, it should be recalled that the European Data Protection Board had the opportunity to clarify that “the data and technologies used to contribute to the fight against COVID-19 must serve to give people more tools, rather than to control, stigmatise or repress their behavior”.

A careful analysis of Regulation no. 679/2016 (the so-called “GDPR”) also shows that, with reference to health emergency situations, the European legislator does not place any obstacle in the way of pursuing important interests in the public health sector by means of the processing of personal data (see recitals 52 and 54 and art. 9, paragraph 1, letters g and i), where such interests also include “monitoring and alert purposes, the prevention or control of communicable diseases and other serious threats to health”.

On the same basis, the Italian Privacy Code - last updated by Italian Legislative Decree no. 101/2018 - also refers to the provisions of the GDPR and indeed expressly considers “relevant” the interest of those who process personal data for the performance of public interest tasks (or tasks related to the exercise of public authority) in the health sector and for the health and safety of the population.

Consequently, it may be said that data protection rules support the adoption of measures and solutions aimed at curbing the spread of Covid-19 without, however, forfeiting the protection of individual privacy.

However, these solutions - and this is where the “balance” can be found - must always be based on laws that regulate them and that expressly establish appropriate and specific measures to protect the fundamental rights and freedoms of the data subjects, including the types of data that can be processed, the processing operations that can be carried out and the reason of the relevant public interest (see art. 9, paragraph 1, letter i of the GDPR and art. 2-sexies of the Privacy Code).

4. Conclusions and considerations

So can we “trust” Immuni? The answer must be affirmative.

As already said, the use of this app is left to the conscientiousness of each of us, since the key principle that inspires it is voluntariness: each user, in other words, will be free to download it, to enter their personal data into the app, including data relating to their state of health (i.e. Covid-19 positivity), and to comply or not with the instructions received from the app following a potential contact with an infected person.

However, experts tell us that at least 70% of the population should download the app for it to contribute significantly to containing the pandemic. This is certainly an ambitious objective which, in order to be achieved, requires first of all a public awareness campaign aimed at making the app easy to understand, especially for those who are not experts in the field, and at clarifying what guarantees the Italian Government has identified to protect our privacy.

Briefly, the guarantees that Immuni offers for the protection of our personal data, and of which everyone should be aware, are: transparency towards the data subjects (before registering with the app, we will know, for example, for which purposes our personal data will be processed, for how long and to whom it will be communicated), the exclusivity of the purpose of the processing (our data will be used only for the containment of infections, excluding any other purpose) and the minimisation of the processing (only the data necessary to trace our contacts will be collected, and reliable anonymisation and pseudonymisation techniques will be adopted).

But that’s not all. Indeed, spreading accurate knowledge of data protection rules - their real meaning, their function and their value - is even more important, indeed indispensable, for contributing to a widespread “legal culture” and pursuing an ambitious goal that we are all called upon to realize in this delicate historical moment.

The President of the European Data Protection Board, Andrea Jelinek, has expressly reiterated this concept to the European Commission by stating: “the voluntary adoption of a contact tracing system is associated with individual trust, thus further illustrating the importance of data protection principles”.

Awareness and trust. And vice versa.