The “Magic Avatar” and the world of artificial intelligence: lights and shadows of a trend that “revolutionizes” privacy and copyright

On December 7, 2022, “Lensa” turned out to be the most popular iPhone app on the Apple App Store. The reason? Although “Lensa” has been on the market since 2018, last November it launched a new feature called “Magic Avatar”: taking advantage of artificial intelligence, this feature allows users – upon payment of a fee – to transform their selfies into virtual avatars.

At first glance, one does not see the problem with an avatar that shows the (clearly enhanced) face of the subject of the selfie; upon closer analysis, however, several legal issues are connected to the use of this “Lensa” feature.

Indeed, the application works thanks to artificial intelligence and on the basis of huge amounts of data (so-called “datasets”), which are stored and used to improve the performance of the application itself. In most cases, these datasets are nothing more than images collected indiscriminately from the web, over which there is obviously no real control as to the existence of any rights. This is the first problem: the collection and dissemination of illustrations without the consent of the artists who created them amounts to copyright infringement. Authors receive no recognition or remuneration for their works – which should instead be guaranteed to them under the Italian Copyright Law (Law no. 633/1941, as amended) – and they also find themselves competing with artificial systems able to “emulate” their style in a few minutes.

The problem does not concern the avatar generated by the “Lensa” application as such, but the huge number of images scraped from the web, which the system uses to train itself and from which it must “learn” in order to reproduce the avatar. The consequences of such a trend should not be underestimated: it is fair to wonder whether one day artificial intelligence might completely replace human activity. Such an undesirable scenario is not so unlikely if we consider that the treatment of visual works created through artificial intelligence systems is currently being studied by the US Copyright Office.

In order to (partially) address this issue, the website “Have I Been Trained” was created to help content creators find out whether the datasets used to train artificial intelligence systems unlawfully reproduce their creations.

There is also a second, more worrying aspect concerning the processing of personal data by “Lensa”. Upon payment of a very small fee to generate the avatar, users provide the application with personal data and information that may also be used for purposes completely different from the creation of “filtered images” and that therefore have significant economic value. This is one of the main complaints made against the app: once installed, “Lensa” collects more data than is necessary for its operation, transferring it to servers located in the USA (where the company’s registered office is located). That alone is enough to argue that the data processing does not comply with the GDPR.

Indeed, the Lensa app’s privacy policy states that users’ biometric data (defined under Art. 4(14) of the GDPR as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”) would be deleted from the servers once the app has used them to generate the Magic Avatar.

The point is that – as often happens – “Lensa”’s privacy policy is long and complex, and it adopts legal terms that are difficult for users to understand; for example, we read that “Lensa” does not use “facial data” for reasons other than the application of filters, unless the user consents to the use of the photos and videos for a different purpose. This might seem comforting but, on a deeper analysis of the terms and conditions, it turns out that “Lensa” reserves far broader powers – of distribution, use, reproduction and creation – over works derived from user content, subject to an additional “explicit consent” where required by the applicable law (i.e., the various national laws).

But where does such “explicit consent” come from? Easy: by sharing the avatar publicly or tagging “Lensa” on social networks, even via hashtag, the user gives consent to the use of that content and thus authorizes the company to reproduce, distribute and modify it. This licence – which ends with the deletion of the account – is justified in Lensa’s privacy policy on the basis of so-called “legitimate interest” (i.e. “it is our legitimate interest to make analytics of our audience as it helps us understand our business metrics and improve our products”).

However, this statement raises some concerns, especially in light of the decision issued by the Italian Data Protection Authority concerning the social network “Clubhouse”, according to which a company’s “legitimate interest” is not the proper legal basis for processing such data and therefore cannot be relied upon either for carrying out data analytics or for the “training” of such systems.

In the end, artificial intelligence undoubtedly represents an epoch-making technological evolution, but its use may entail the risk of an unlawful compression of users’ rights; indeed, a European Regulation on artificial intelligence, aimed at defining the scope and conditions of its use, has been under consideration for some time.

In this respect, it is to be hoped that the “Lensa” application will take steps as soon as possible to protect illustrators’ rights by recognizing proper remuneration to them, and to ensure that user data is collected and processed correctly, in accordance with applicable privacy laws.


Drones and Privacy: an imperfect match

While until a few years ago the use of drones was the prerogative of video makers and the military sector, nowadays, thanks to technological evolution and lower costs, these small, compact aircraft increasingly serve as recreational devices through which evocative landscape shots can be taken.

Precisely because of their ability to show the world as it has never been seen before, from an original and unusual perspective, their unmistakable buzzing sound, perceptible dozens of meters away, is also beginning to be heard in cities, on beaches, or simply at organized events.

Any drone includes at least a GPS module and a camera, although the configuration can become more elaborate if required: more advanced drones also include night-vision cameras, 3D scanners, thermal imaging cameras, WiFi and Bluetooth devices, and so on.

So one may wonder: to what extent is the use of such devices lawful? The answer is not obvious, especially considering that a drone is equipped not only with a camera, but also with internal memory capable of collecting and storing data and information about individuals in the area flown over.

The nature of such machines and the advanced technologies with which they are equipped make them inherently suitable tools for capturing "sensitive" data. It is clear that careless use of a drone, even for purely recreational purposes, could conflict with the confidentiality and privacy of the persons filmed.

To answer this question, we must first look at European Regulation 2016/679, also known by its acronym GDPR.

Not all drone pilots are privacy experts, so those who decide to fly one, even in a moment of recreational entertainment and fun among friends, may be unaware that they must follow certain rules and good practices to avoid violating privacy regulations – and not only that. In fact, it should be kept in mind that careless use of a drone could have civil as well as criminal implications.

Therefore, two different issues come into play in this regard: the privacy of the people filmed (in terms of data acquisition) and the protection of their personal data (in terms of subsequent use).

First of all, as is well known, the right to privacy of third parties must not be violated by filming private residences or places closed to the public from above. Such an infringement will entitle the injured party to compel the video maker to destroy the collected images and to refrain from further filming, without prejudice to the right to take legal action for compensation for any damages suffered (Art. 10 of the Italian Civil Code).

The issue becomes more delicate when we not only make videos, but also decide to disclose the footage and images now in our possession by posting them, for example, on our social networks or on the Internet. In such cases, it is essential to take all the measures imposed by the GDPR in order to minimize the risk of incurring the heavy penalties that the Italian Data Protection Authority might impose.

To begin with, the Italian Data Protection Authority emphasizes that, when flying a camera-equipped drone in public places such as parks, streets or beaches, one should avoid invading people's personal spaces and intimacy and, in any case, avoid capturing images containing personal data such as car license plates or home addresses.

Not only that. If the decision to disclose the footage is made, the first essential step is to obtain the consent of the subjects involved to the publication of the images, which is the legal basis making their distribution lawful (Art. 6 GDPR). Such consent is not required only where, due to the distance of the filming, the subjects' faces are not recognizable or are in any case obscured.

The GDPR also considers lawful the filming necessary for the performance of a contract concluded with a person who purchases a product delivered to his or her home by the seller by means of a drone.

Pilots, moreover, should always observe the data processing principles set forth in Article 5 of the GDPR, which requires that data be adequate, relevant and not excessive with regard to the purposes for which it was captured. In compliance with these principles, the drone pilot should therefore favor proportionate technology and prefer anonymization techniques that, through blurring or other graphical effects, automatically obscure images so as to avoid the identification of individuals where it is not necessary.
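By way of illustration only – the Authority's guidance does not prescribe any particular tool – the following is a minimal sketch of such automatic obscuring in Python, assuming OpenCV and its bundled Haar cascade face detector (the file names are hypothetical):

```python
# Minimal sketch of automatic face blurring for drone footage.
# Assumes OpenCV (pip install opencv-python); file names are hypothetical.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("drone_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect faces and replace each region with a heavy Gaussian blur,
# so that the individuals filmed are no longer identifiable.
for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 30)

cv2.imwrite("drone_frame_anonymized.jpg", frame)
```

The same logic can be applied frame by frame to a video stream; in practice a more robust detector would be preferable, since any face the cascade misses remains identifiable.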

For more shrewd pilots, it should be stressed that it is extremely risky to justify the collection of sensitive data by invoking the non-applicability of the GDPR to the processing of personal data "carried out by a natural person in the course of a purely personal or household activity" (Art. 2(2)(c) GDPR). Indeed, the judges of the Court of Justice of the European Union interpret this rule narrowly and, as a general rule, this provision does not constitute an exemption from the GDPR (Judgment of the Fourth Chamber, 11 December 2014, C-212/13, František Ryneš v. Úřad pro ochranu osobních údajů).

Finally, the criminal exposure of the pilot who decides to make more brazen use of the drone must not be forgotten: procuring visual or sound footage containing data pertaining to people's private life taking place in private residences is punishable by imprisonment from six months to four years (Articles 614 and 615-bis of the Italian Criminal Code).

Here too, the law is more severe when the filming is unlawful, that is, taken without the consent of the person filmed. Once again, the importance of obtaining consent emerges: for the drone pilot, it may be the only defence against an otherwise certain conviction, together with objective reasons of a higher order justifying the filming (e.g., public order requirements).

In conclusion, privacy protection must be carefully evaluated in light of the enormous technological potential of drones and the underlying economic interests. It is easy to predict that the increasing use of drones in activities with high social impact will make the protection of people's privacy increasingly prominent. Common sense and precaution, after all, remain the best principles for the responsible use of new technologies: referring to them would be sufficient to resolve many doubts and disputes.


The Italian Data Protection Authority and the 2021 inspection activity: biometric data, video surveillance, food delivery and data breaches will be in the spotlight between January and June

The Italian Data Protection Authority (DPA) has defined the boundaries of the inspection activity planned for the first six months of 2021. This will include 50 inspections, to be conducted in part by the Italian Finance Police (under delegation by the DPA), and will focus on verifying compliance with the applicable privacy laws in the following matters of general interest:

  1. processing of biometric data for facial recognition, including through video surveillance systems;
  2. processing of personal data in the context of the so-called "domestic video surveillance" sector and in the sector of audio/video systems applied to games (so-called connected toys);
  3. processing of personal data carried out by "data brokers";
  4. processing of personal data carried out by companies operating in the "Food Delivery" sector;
  5. data breaches.

Two significant developments emerge from this list: this year, the Italian DPA will extend its inspections to the processing of biometric data, as well as to processing carried out through video surveillance systems. These are two areas governed not only by the GDPR and the Italian Privacy Code but also by various guidelines and other legal provisions, as well as by extensive case law.

Let us mention, by way of example, the Italian DPA's 2014 Guidelines on biometric recognition and graphometric signature, the amended Article 4 of Law no. 300/1970 and Administrative Memo no. 5/2018 issued by the National Labour Inspectorate, the Italian DPA's 2010 decision on video surveillance and its recent FAQ on video surveillance of 5 December 2020, the national and EU case law on the monitoring of workers and so-called "defensive controls", Opinion no. 2/2017 of the former Article 29 Working Party ("Opinion 2/2017 on data processing at work"), as well as Guidelines no. 3/2019 of the European Data Protection Board (EDPB) on the processing of personal data through video devices.

The above considerations highlight the complex task facing data controllers and processors – i.e., economic operators – in identifying the privacy requirements to be met. Especially before embarking on an activity involving the processing of biometric data or the use of video surveillance systems, it is necessary to clarify the particular circumstances of the case at hand (identifying the purposes of the processing, the security measures to be adopted, the possible involvement of third-party providers, etc.) in order to correctly prepare the privacy documents required by the many applicable regulations (possibly with the help of specialized professionals).

It will therefore be interesting to analyse the results of the Italian DPA's inspection activity to understand – three years after the GDPR became applicable – the level of compliance that the Authority considers "acceptable" and the actual level of compliance reached by companies operating in Italy that process special categories of personal data and use video surveillance systems.

Of course, the privacy obligations relating to the processing of biometric data or processing through video surveillance systems come in addition to those generally required for the processing of personal data. Consequently, in order to achieve full compliance with the privacy regulations in force, it is necessary not only to regulate particular areas of business activity (such as video surveillance or biometrics) but also to adopt (or rather, to have already adopted) a solid internal privacy structure which – in case of inspections – can prove to the authorities that the processing of personal data carried out fully complies with the relevant legal provisions.

With particular reference to video surveillance, we would like to remind you that our Firm has developed and published on its website the quick and useful Guidelines for the installation of video surveillance systems, updated with the latest Italian and European regulations. You can consult the Guidelines here.


November 25 - International day for the elimination of violence against women

It was November 25, 1960 when the Mirabal sisters lost their lives in Santo Domingo under the Trujillo dictatorship. The memory of that tragic moment was institutionalised only in 1981, when 25 November was recognised as a symbolic date in the fight against violence against women.

Today, exactly 60 years later, violence against women is also perpetrated through the use of technology and IT tools; it is sufficient to consider the data published today by the Central Directorate of the Criminal Police, according to which, in Lombardy alone, about 718 cases of revenge porn have been recorded over the last year.

Revenge porn is a recently introduced criminal offence (Law no. 69/2019 – Art. 612-ter of the Italian Criminal Code) which punishes anyone who, having received or otherwise acquired sexually explicit images or videos intended to remain private, disseminates them without the consent of the persons depicted in order to harm them.

How to address this kind of behaviour?

There is no single answer, but there must certainly be awareness and knowledge of two concepts that are still (unfortunately) underestimated today – "privacy" and "confidentiality" – as well as of the tools that current legislation provides to protect our personal data from unlawful appropriation.

For example, do we pay enough attention when posting images online that portray us or other individuals? A few days ago, an infographic published by the Italian Data Protection Authority (available at the following link) provided useful suggestions and advice on this matter.

Have we ever checked, before starting a conversation and/or chat with a third party using an IT tool, whether the communication service provider has adopted appropriate security protocols for our conversation? To better understand this matter, you can consult the explanatory page provided by WhatsApp at the following link.

Again, do we know which applications or devices – for example, in our homes or installed in the places we often go (such as swimming pools, gyms, etc.) – are able to film and/or listen to us without our knowledge? And in the latter case, do we know how to exercise our privacy rights?

After all, the digital world also opens up new horizons with regard to the protection of women against violence and in this context the protection of our personal data plays a very important role.

Eliminating violence in all of its forms is the common goal; awareness and knowledge are crucial to this end.


Facial Recognition and the Digital World: technology at the service of convenience?

How many of us unlock our smartphones, make an online payment, authorize the download of an app and/or access a web portal simply by bringing the mobile device closer to our face? How easily do we "tag" our friends in our pictures on the most well-known social networks? And again: how many and what advantages may be obtained from knowing the number of passers-by who stop, even if just for a moment, to look at a billboard?

Statistics show that facial recognition technology is at the service of a digital world that "runs" faster and faster and which forces us to keep up with the times. But at what price for the protection of our personal data?

1. Introduction

Social networks, e-commerce websites, online magazines, home banking and mobile apps: there are millions of digital services available online that we can use through the creation of personal accounts.

When creating profiles, the most widespread trend, especially among young people, is to rely on easy and intuitive passwords (such as a date of birth or first name) which are not very secure from an IT point of view and are often identical across all the services those people use[1].

In order to deal with these bad habits – which only feed the already high number of data breaches – it has now become common to use so-called "facial recognition" technology (in Italian, "riconoscimento facciale"): an IT process that associates the features of a person's face with a digital image and stores that representation in an electronic device for the purpose of reusing it not only as a means of identification but also for the authentication, verification and/or profiling of individuals.
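To make the identification/verification flow concrete, here is a minimal sketch, assuming the open-source Python library face_recognition (an illustrative assumption; the article does not name any specific tool, and the file names are hypothetical): enrolment derives a numeric template from a reference photo, and verification compares a fresh capture against it.

```python
# Minimal sketch of biometric verification with the open-source
# "face_recognition" library (an illustrative assumption).
import face_recognition

# Enrolment: derive a 128-dimensional numeric template from a reference photo.
reference = face_recognition.load_image_file("reference_selfie.jpg")
template = face_recognition.face_encodings(reference)[0]

# Verification: compare a fresh capture against the stored template.
capture = face_recognition.load_image_file("new_capture.jpg")
encodings = face_recognition.face_encodings(capture)

match = bool(encodings) and face_recognition.compare_faces(
    [template], encodings[0], tolerance=0.6  # the library's default threshold
)[0]
print("verified" if match else "not verified")
```

Note that it is this kind of derived template, rather than the photograph as such, that turns an image into "biometric data" in the legal sense discussed below.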

But is it really always safe to rely on facial recognition? Does a biometric system always guarantee sufficient protection of our personal data?

2. The most frequent uses of facial recognition technology

It is well known that various biometric techniques lend themselves to use mainly in the IT context (for example, for authenticating access to a device), and the main high-tech companies are investing ever greater amounts of money in this field.

However, facial recognition is also used outside the digital world: take for example the use of biometric systems for the control of physical access to reserved areas, for the opening of gates or for the use of dangerous devices and machinery.

But that's not all. Facial recognition techniques are also capable of serving public authorities and even research. The police in New Delhi have in fact tested facial recognition to identify almost 3,000 missing children, and researchers have used it to detect a rare genetic disease found in subjects from Africa, Asia and Latin America[2].

Faced with such a wide range of uses of facial recognition, it is worrying that specific national legislation on this matter has not yet been enacted in our country. Indeed, agreeing to the detection and collection of the features of our face by a data controller means sharing a wide range of personal data with the latter and exposing ourselves to whatever processing the controller decides to carry out on such data.

Think about a simple "selfie" taken with our smartphone: in these cases our device collects our personal image and stores it in memory. Or think about walking past billboards that detect our presence, having our body temperature measured by video and digital thermometers, or the video-recognition boarding systems installed in the world's largest airports.

3. A quick vademecum for the processing of biometric data

The biometric characteristics of a face that allow for the unique identification of a natural person fall within the notion of "biometric data" provided by European Regulation no. 679/2016 ("GDPR")[3]. In fact, biometric data is defined by the GDPR as data "resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person"[4]. This means that an image or photograph does not always qualify as biometric data: it does so only if it is processed through specific technical means that allow for the unique identification or authentication of a natural person[5].

Biometric data also falls within the "special categories of personal data" under Art. 9 of the GDPR (referred to by Art. 2-septies of Legislative Decree no. 196/2003, the "Privacy Code") and can be processed only when the data controller complies with certain legal obligations. Let us list some of these obligations below:

A. Compliance with the fundamental principles of processing. In an increasingly digital world, the principles of "privacy by design" (data protection by design) and "privacy by default" (data protection by default) provided for by Art. 25 GDPR play a leading role[6]. To comply with these principles, data controllers who use facial recognition must, starting from the design and definition phases of the processing tools, provide adequate security measures to ensure the protection of the fundamental rights and freedoms of individuals as well as compliance with the principles set out in Article 5 of the GDPR.

Specifically, attention should be paid to the principle of "data minimization", which requires the data controller to configure a biometric recognition system so as to collect and process only a limited amount of information, excluding the acquisition of additional data that is not necessary for the purpose pursued in the specific case (for example, if the purpose of the processing is computer authentication, biometric data should not be processed in such a way as to infer sensitive information about the data subject, including, for example, clearly visible skin diseases). A minimal illustration of this principle is sketched below.
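Purely by way of example – no specific technique is mandated – the sketch below illustrates minimization at enrolment, using the same hypothetical face_recognition library as above: only the derived numeric template is retained, and the raw photograph, which could reveal unrelated sensitive traits, is discarded.

```python
# Illustrative sketch of "data minimization" at enrolment.
# Assumes the face_recognition library; names are hypothetical.
import os
import face_recognition

def enroll_minimized(image_path: str, store: dict, user_id: str) -> None:
    """Keep only the numeric template needed for authentication."""
    image = face_recognition.load_image_file(image_path)
    encodings = face_recognition.face_encodings(image)
    if len(encodings) != 1:
        raise ValueError("expected exactly one face in the enrolment photo")
    store[user_id] = encodings[0]  # 128 floats: the minimized dataset
    os.remove(image_path)          # discard the raw capture, which could reveal
                                   # data not needed for authentication (e.g.
                                   # clearly visible skin conditions)

templates: dict = {}
enroll_minimized("enrolment_photo.jpg", templates, "user-001")
```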

B. Information notice. The data controller must provide data subjects with a privacy notice in accordance with Art. 13 of the GDPR which indicates, in a clear and transparent manner, the purposes of the processing, the security measures adopted, the possible centralization of the biometric data collected, and the storage periods of the personal data. In this regard, it should be pointed out that, as clarified by the Italian Data Protection Authority[7], such privacy notice has to be delivered before the so-called "enrolment" phase, which takes place before the creation of a biometric sample[8].

C. Legal basis of the processing. The data controller must obtain the prior consent of the data subjects in order to process their biometric data or, alternatively, assess the possibility of relying on another legal basis under Article 9 of the GDPR (including, for example, reasons of public interest in the area of public health, such as protection against serious cross-border threats to health).

D. DPIA. As provided for by Art. 35 of the GDPR and Annex 1 to Provision no. 467/2018 of the Italian Data Protection Authority, the data controller must assess the impact of the processing of biometric data, specifically evaluating the risks that such processing may entail for the rights and freedoms of individuals and, at the same time, identifying the security measures adopted and to be adopted to address those risks.

E. Appointment of the data processor. Where the data controller engages a third party for the processing of biometric data, the latter must be appointed as a "data processor" pursuant to Art. 28 of the GDPR, after verifying that the third party offers suitable guarantees for the protection of the rights of the data subjects whose biometric data is processed.

F. Implementation of alternative systems. The data controller must offer alternative solutions that do not involve the processing of biometric data, without imposing restrictions or additional costs on the data subject. Such alternatives are necessary especially for those who cannot comply with the constraints imposed by a biometric system (think of a disabled person unable to reach the height of a thermoscanner with his or her face) and in cases where the device is unavailable due to technical problems (for example, a malfunction).

4. Conclusions

The applicable data protection regulations are not and should never be considered as an obstacle to the development of new technologies applied to the IT and digital industry. On the contrary, compliance with existing legislation should be an incentive for creating practical solutions in a way that respects the confidentiality of our information.

This should also be the case for facial recognition technology, in relation to which it is important to make users aware of the security of the processing of their personal data – not least because generating awareness means gaining consumers' trust, which is the first step of a sound marketing strategy.

Apple has done just that with its recent "iOS 14" update, which allows owners of the latest mobile devices to know – through colored indicators (green and orange) that appear in the device's status bar – whether an installed app is using the camera and thus capturing the user's image.

On the other hand, the protection of our personal data must never be sacrificed. To this end, in our opinion, it is essential that our country enact regulations governing this technology. The added value that facial recognition can bring to our economy has long been plain for all to see, but if we do not act at the regulatory level in the short term, the risk is that in a few years we will face the uncontrolled development and use of these technical solutions, and will have to spend time and economic resources solving multiple problems rather than generating new advantages.

 

[1] This is confirmed by an interesting (and, for all of us, worrying) study published during the "Safer Internet Day", according to which more than half of Italian millennials (55%) use the same password to access different services and 19% use extremely simple passwords such as a numerical sequence.

[2] Also noteworthy is the new "TELEFI" project ("Towards the European Level Exchange of Facial Images"), funded by the European Commission: a study on the benefits that the use of facial recognition can bring to crime investigation in EU Member States and on the exchange of the data collected within the "Prüm" system, through which DNA, fingerprint and vehicle registration data are exchanged between EU countries to combat cross-border crime, terrorism and illegal migration.

[3] Classic examples of biometric data, in addition to the characteristics of the face, are: fingerprints, the dynamics of handwritten signature placement, the retinal vein pattern, the shape of the iris, and the characteristics of voice emission.

[4] See, for more details, the Opinion of the Working Party ex art. 29 (now replaced by the “European Data Protection Board”) no. 2/2012 - https://www.pdpjournals.com/docs/87997.pdf.

[5] See Recital no. 51 GDPR.

[6] See “Guidelines no. 4/2019 on Article 25 Data Protection by Design and by Default” - Version 2.0 Adopted on 20 October 2020.

[7] See on this matter “Guidelines on biometric recognition and graphometric signature” issued by the Italian data protection Authority on 12 November 2014.

[8] The term "enrolment" refers to the process through which a subject is accredited to the biometric system through the acquisition of one of his or her biometric characteristics. Indeed, to enable biometric recognition it is necessary to acquire the biometric characteristic by way of a procedure ensuring that enrolment is performed appropriately, that the link with the captured subject is retained, and that the quality of the resulting biometric sample is safeguarded. Generally, the facial biometric sample is used to extract, via algorithms sometimes based on so-called "neural networks", a given set of features – such as the location of the eyes, nose, nostrils, chin and ears – in order to build up a biometric template.
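As a purely illustrative sketch of the feature-extraction step just described – again assuming the hypothetical face_recognition library, which exposes a facial landmark detector, and a hypothetical file name – the locations of the eyes, nose and chin can be obtained as follows:

```python
# Illustrative sketch of landmark extraction for template building.
# Assumes the face_recognition library; the file name is hypothetical.
import face_recognition

image = face_recognition.load_image_file("biometric_sample.jpg")

# Each detected face yields a dict mapping feature names
# ("chin", "nose_tip", "left_eye", ...) to lists of (x, y) pixel points.
for landmarks in face_recognition.face_landmarks(image):
    for feature in ("chin", "nose_tip", "left_eye", "right_eye"):
        print(feature, landmarks[feature])
```

A biometric template is then typically derived from such features (for example, as a fixed-length numeric vector), and it is the template, rather than the raw image, that is stored and compared during recognition.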