Data protection in the collaborative economy – by Vassilis Hatzopoulos

French version HERE

The role of personal data in the collaborative economy: data as a currency

Data are inherently significant for the very existence and function of collaborative platforms: they collect and process great amounts of data concerning the age, gender, residence, employment, professional qualifications, dietary or other preferences, health condition, medications, location and economic details of both sides of the two-sided market in order to perform better matches. Data are acquired, then analysed through the use of algorithms, then applied in order to fulfil the platforms’ matching function.

Data acquisition

Platforms acquire data directly from the users, who type and/or upload personal data, such as name, photo, ID, etc, in order to register with the platform; or who log in through an already existing social media account, thus connecting the dots to create a complete digital profile (mandatory data). Users then continue to actively upload personal data (location, cultural preferences, financial status, medical conditions, etc), which are necessary for the provision of the underlying service, the increase of trust among peers and the development of their ‘self-branding’ if they act as suppliers (use data). Further, platforms ‘passively collect’ data through the users’ web browsers (eg IP address), cookies, or simple usage statistics. Moreover, through their mobile apps, platforms require access to the user’s contacts, location, SMS, phone calls, photos, camera, etc (technical data), which may even be accessed by platforms when the app is not being used. What is more, platforms can acquire individuals’ personal data from data brokers, through the sale and/or exchange of data between companies, or else through mergers and acquisitions. Last but not least, new data can be created by combining and analysing existing datasets, whether these concern a single individual or combine the personal data of one user with that of others.

Data analysis

Sophisticated, self-learning algorithms and other big-data analytics tools are able to process, and yield conclusions from, a large volume and variety of data at high speed. Data mining of existing databases by intelligent computer algorithms can elucidate hidden patterns either automatically and independently, or in response to specific hypotheses and queries. Hence, for example, recommendations for similar products or services on a platform, such as Airbnb’s ‘For You’ section, are the result of data mining of previous preferences and analysis of patterns in the commercial behaviour of the user.
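By way of illustration – and purely as an invented sketch, not Airbnb’s actual ‘For You’ algorithm – the co-occurrence logic behind such recommendations can be reduced to a few lines of Python (all names and data below are hypothetical):

```python
from collections import Counter

# Hypothetical booking histories: each user maps to the set of
# listings they have previously booked (invented data).
past_bookings = {
    "alice": {"loft_paris", "cabin_alps"},
    "bob":   {"loft_paris", "villa_rome"},
    "carol": {"cabin_alps", "villa_rome", "loft_paris"},
}

def recommend(user, bookings):
    """Suggest listings booked by users with overlapping histories."""
    mine = bookings[user]
    scores = Counter()
    for other, theirs in bookings.items():
        if other == user or not mine & theirs:
            continue  # skip the user themselves and non-overlapping users
        for item in theirs - mine:
            scores[item] += 1  # each co-booker counts as one 'vote'
    return [item for item, _ in scores.most_common()]

print(recommend("alice", past_bookings))  # → ['villa_rome']
```

Even this toy version makes the privacy point: the recommendation for ‘alice’ is derived entirely from the behavioural data of other users whose histories overlap with hers.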

Data application

The information acquired and analysed is applied both for providing matches and for targeted advertising or the promotion of sponsored content. Targeted personalised content may derive from a psychological profile put together and attributed to the user on the basis of the data collected, rather than simply from his/her preferences and behaviour online. Further, collected personal data and/or ‘profiles’ may be sold to other companies and produce direct revenue for the collaborative platforms.

The aforementioned ‘data process’ shows that data act as a currency: they are given away as consideration for the services received by users (also known as ‘freemium’, see also Art 3(1) of the Commission’s draft Digital Content Directive), while constituting a ‘tradeable commodity’ for platforms.

Protection of personal data in the EU

Rather than as a currency, European law envisions – and protects – personal data and the right to privacy as fundamental rights, inter alia, in Articles 7 and 8 of the EU Charter of Fundamental Rights, Article 16(1) TFEU, as well as Article 8 ECHR and Convention 108 of the Council of Europe. The EU General Data Protection Regulation (GDPR),[1] together with the proposed ‘e-Privacy Regulation’[2] replacing Directive 2002/58/EC, will enter into force in May 2018 and bring with it strict data protection rules and tough sanctions. However, few (if any) of the rules contained therein seem to take into account the role played by data in the collaborative economy and the way platforms operate:

a) consent given to platforms is rarely as explicit and specific as the rules require it to be, while it is unclear whether it is always prior, since much of the information (such as geolocation, IP address, etc) is available to the platform before any consent is given; nor is consent free where it constitutes the condition for access to (dominant) applications or where there is a clear imbalance between the parties (such as, eg, between a platform and its prosumers); and of course, consent is never really informed, since users cannot possibly imagine all the ways in which their data are used by the platforms.

b) sensitive data: collaborative platforms process data similar to medical records, including illnesses, disabilities, medications, test results, medical history, allergies, mental health information, drug use and so on. Additionally, the combination of individual health or non-health data can create a detailed medical profile, while even the most innocuous non-health data, when collected over a significant period of time, combined and analysed, can reveal health-related information and thus qualify as health data. Other data, such as the itineraries often followed by Uber customers, may be revealing of their sexual preferences and lives, as Uber’s notorious ‘Rides of Glory’ scandal has shown.[3] Hence, it needs to be ascertained on a platform-by-platform basis – depending on the algorithms used and the use made of the data – which data shall be subject to the stricter rules of Article 9 GDPR.

c) the right to be forgotten: this right was recognised by the CJEU in Google Spain[4] and is consolidated in the GDPR (Article 17). However, it is far from clear how/whether a user may require the erasure of personal data which have been given as consideration for a service already received and consumed, whether such a right is ceded to the platform, and whether affected third parties (eg because their rating is being reduced) have any say concerning such an erasure.

d) profiling: data subjects have the right not to be subject to decisions based solely on automated processing, or ‘profiling’ (GDPR, Article 22). Considering the myriad automated decisions taken by algorithms on the basis of personal data – such as the scanning of message content and flagging of suspicious activities, decision-making concerning a provider’s ‘job performance’ based on the reviews received, the matching of suppliers with users of, eg, driving services on the basis of location data, automated payment or the withholding of such payment, and even the ‘firing’ of a service provider – it is clear that most collaborative platforms plainly violate this obligation, unless they introduce time-consuming review mechanisms with human intervention at all phases of decision making.
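The kind of decision Article 22 targets, and the compliant alternative, can be sketched in hypothetical code (the thresholds, names and logic are invented for illustration and do not describe any actual platform):

```python
from dataclasses import dataclass

@dataclass
class Decision:
    action: str        # 'keep', 'deactivate', or 'human_review'
    automated: bool    # True if no human was involved

def review_provider(ratings, threshold=4.0, fully_automated=True):
    """Decide a provider's fate from their average review score."""
    avg = sum(ratings) / len(ratings)
    if avg >= threshold:
        return Decision("keep", automated=True)
    if fully_automated:
        # This branch is what Article 22 constrains: a decision with
        # significant effects taken solely by automated processing.
        return Decision("deactivate", automated=True)
    # Compliant alternative: escalate to a human before any sanction.
    return Decision("human_review", automated=False)

print(review_provider([5, 3, 2]))                         # automated deactivation
print(review_provider([5, 3, 2], fully_automated=False))  # routed to a human
```

The sketch shows why compliance is costly: the human-review branch has to be inserted at every decision point where the automated outcome significantly affects the data subject.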

Protection of personal data outside the EU

Issues become even more complex for platforms based in third states, since the GDPR is supposed to have extra-territorial effects and such platforms may be required by the Commission to give access to their files; at the same time, the free flow of data between such third states and the EU would require an ‘adequacy decision’ to be adopted by the Commission, stating that the level of protection offered in such states is substantially equivalent to that foreseen in the GDPR. Such adequacy decisions are closely monitored – and often annulled – by the Court (see eg Case C-362/14 Maximillian Schrems v Data Protection Commissioner EU:C:2015:650 and the pending Case T-670/16 Digital Rights Ireland v Commission [2016] OJ C 410/26).

Non personal data

Combined with personal data, non-personal data may also be used against the interests of data subjects. Uber revealed that ‘desperate customers with smartphones on low battery are willing to pay even ten times more for a car ride than usual’.[5] Hence, the pricing policy could possibly be altered accordingly, thus influencing the consumer, even though information on battery level is not considered ‘personal data’ and thus is unprotected.
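The concern can be made concrete with an invented pricing sketch – this is emphatically not Uber’s actual code, and the multiplier, threshold and function names are all hypothetical:

```python
# Hypothetical surge-pricing function that quietly factors in the
# rider's battery level. Battery state falls outside the GDPR's
# definition of personal data, yet could still be exploited.

def fare(base_price, demand_multiplier, battery_pct, exploit_battery=False):
    """Compute a ride fare; optionally mark up desperate low-battery riders."""
    price = base_price * demand_multiplier
    if exploit_battery and battery_pct < 10:
        price *= 1.5  # invented mark-up on riders about to lose their phone
    return round(price, 2)

print(fare(10.0, 1.2, battery_pct=50))                       # → 12.0
print(fare(10.0, 1.2, battery_pct=5, exploit_battery=True))  # → 18.0
```

Two riders requesting the identical trip at the identical moment would pay different prices, and the discriminating variable would escape data protection law entirely.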

Arbitrage between data as a currency and data as a right: empowered users?

From the above it becomes clear that a friction exists between the protection of personal data and their use by collaborative platforms. The EU legislature and Court have, so far, seemed to privilege data protection over other conflicting rights. Such an attitude, however, seems incompatible with the development of a collaborative economy based on massive data manipulation.

Information and empowerment of data subjects is key for the effective protection of their personal data. Platform users are already more active than the average consumer. It is only fitting that they actively choose how their data will be processed, once ‘trained’ to become more mindful about their control over their data. In line with the self-regulation culture enshrined in the collaborative economy, platforms could offer data subjects choices about data processing at every step of the way, through pop-up boxes and comprehensible questions, instead of lengthy, all-inclusive Terms and Conditions. Further, platforms’ privacy notices could be supported by ‘kite-marks’, ie visual symbols explaining the effects of their decisions.[6] ‘[S]uch kite-marks should include a graded scale indicating levels of data protection, similar to the traffic light system used in labelling for food products’,[7] so that platform users know instantly the level of data protection provided by the platform. This could also enhance competition between platforms on the basis of the level of data protection offered to their potential users.


Next week: Collaborative economy and competition law



Vassilis HATZOPOULOS is full Professor of EU Law and Policies at the Panteion University, Athens (Greece), visiting Professor at the College of Europe, Bruges (Belgium), honorary Asst. Professor at the University of Nottingham (UK), and Attorney at law – member of the Athens Bar. A leading expert in EU law, he notably wrote the first reference book on the collaborative economy: The Collaborative Economy and EU Law (Oxford, Hart, 2018).






[1] Regulation 2016/679.

[2] COM/2017/010 final.

[3] ‘Sex and Uber’s “Rides of Glory”: The company tracks your one-night stands – and much more’ The Oregonian/Oregon Live (20 November 2014), available at

[4] Case C-131/12 Google Spain (n 99).

[5] O Zezulka, ‘The Digital Footprint and Principles of Personality Protection in the European Union’ (2016) Prague Law Working Papers Series No 2016/III/2, 3, available at; see also B Carson, ‘You’re more likely to order a pricey Uber ride if your phone is about to die’ Business Insider (12 September 2016), available at

[6] This view is also suggested in the UK House of Lords Report, paras 237-239.

[7] Ibid, para 235.
