Three applicants v Ola Netherlands B.V. C/13/689705 / HA RK 20-258, District Court, Amsterdam (11 March 2021)
An Amsterdam court has ordered Ola (a ride-hailing taxi company similar to Uber) to be more transparent about the data it uses as the basis for decisions on suspensions and wage penalties, in a ruling that breaks new ground on the rights of workers subject to algorithmic management.
James Farrar and Yaseen Aslam, who won the landmark victory in the UK Supreme Court in February, led the action by a group of UK drivers and a Portuguese driver, who brought three separate cases against Ola and Uber seeking fuller access to their personal data.
The following is a summary of the case against Ola taxis. Anton Ekker (assisted by AI expert Jacob Turner, whom we interviewed on Law Pod UK here) represented the drivers. He said that this case was, to his knowledge, the first time a court had found that workers were subject to automated decision-making (as defined in Article 22 of the GDPR), thus giving them the right to demand human intervention, express their point of view and appeal against the decision.
The Facts
Ola is a company whose parent company is based in Bangalore, India. Ola Cabs is a digital platform that pairs passengers and cab drivers through an app. The claimants work as ‘private hire drivers’ (“drivers”) in the United Kingdom. They use the services of Ola through the Ola Driver App and the passengers they transport rely on the Ola Cabs App.
Proceedings are pending in several countries between companies offering services through a digital platform and drivers over whether an employment relationship exists.
By separate requests dated 23 June 2020, the first two claimants requested Ola to disclose their personal data processed by Ola and make it available in a CSV file. The third claimant made an access request on 5 August 2020. Ola provided the claimants with a number of digital files and copies of documents in response to these requests.
Ola has a “Privacy Statement” in which it has included general information about data processing.
All references in this judgment are to the AVG, the Dutch designation of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (the GDPR).
The applicants requested an order from the court that they should be given access, in a common electronic format, to a range of data, including personal data, the recipients to whom that data would be disclosed, and the existence of automated decision-making, including profiling as referred to in Article 22(1) and (4) of the AVG (GDPR). They also asked for appropriate safeguards in the case of transfer to a third country or an international organisation, in accordance with Article 46 AVG.
The claimants also sought an order requiring Ola, within one month of notification of the decision, to provide them with their personal data in a structured, commonly used and machine-readable form, that is to say as a CSV file, in such a way that the data could be transmitted directly to another controller.
They asked the court to enforce the foregoing orders on pain of a penalty of €2,000 for each day or part of a day that Ola remains in default of complying and for Ola to pay the costs of the proceedings.
Arguments before the Court
The claimants maintained that Ola had not provided full access to their personal data in response to their access requests. Ola’s Privacy Statement and accompanying documents showed that the company processes a large number of categories of personal data, but the claimants were not able to obtain access to a large part of these categories. This, they claimed, was inadequate under the GDPR.
Ola makes use of automated decision-making and profiling in the performance of the contract with its drivers. When applying profiling, recital 71 of the AVG (GDPR) requires Ola to implement appropriate procedures and take measures to ensure fair and transparent processing for the data subject. Discriminatory effects of profiling should also be avoided. In order to assess whether Ola had complied with the requirements of Article 22(3) of the AVG when using automated decision-making and profiling, the claimants maintained that they should have been given access to information about the existence of such processing, its underlying logic and its expected consequences. They pointed out that proceedings were being conducted in various countries on the question of whether an employment relationship exists between providers of ‘Ride Hailing apps’ and drivers.
Of importance here is the extent to which such providers exercise management control through, inter alia, algorithms and automated decision-making.
Referring to the recent ruling by the UK Supreme Court that drivers are entitled to the minimum wage and holiday allowance for each hour that they are logged on to a “Ride Hailing platform”, the claimants argued that they needed access to their data in order to calculate these wages. This data, they contended, was necessary for drivers to “organise themselves and build collective bargaining power”. Transparency about data processing, they said, was necessary to protect the interests of drivers vis-à-vis platform providers; and when deciding on their licence to drive, drivers are assessed on the basis of their suitability, in which context their track record and conduct are relevant. Therefore, said the applicants, drivers have an interest in unrestricted access to their data.
Ola contended that the requests should be rejected, or granted (in part) with due regard for the circumstances and guarantees referred to by Ola, and that [applicant 3] be ordered to pay the costs of the proceedings (including subsequent costs), plus statutory interest.
The Court’s conclusions
The Amsterdam District Court found that the car-booking app had used an entirely automated system to make deductions from one driver’s earnings. This is a finding that attracts greater legal protection under Dutch law.
The judge was at pains to stress that, in principle, a data subject does not have to give reasons or substantiate why he is making a request for inspection under the AVG (GDPR).
When exercising his right of inspection, the data subject does not have to put forward a particular interest or state the purpose he intends to achieve with the inspection. The mere fact that data relating to him or her is being processed is sufficient. It is up to the controller to demonstrate abuse of power.
The claimants argued that they wished to check the accuracy and lawfulness of their own data and that this was a condition for being able to exercise other privacy rights. That was sufficient to satisfy the court. Contrary to Ola’s submission, the fact that the claimants (and the union to which they were affiliated) also had another interest in obtaining personal data, namely to use them to obtain clarity about their employment status or to gather evidence in legal proceedings against platforms, did not mean that the applicants were abusing their rights under the GDPR. The claim of abuse of the right of inspection was therefore rejected.
[The relevant section of the AVG/GDPR] allows the data subject to move, copy or transfer personal data easily from one IT environment to another, without hindrance, and regardless of whether the data are held on their own systems, on the systems of trusted third parties or on the systems of new data controllers. Ola rightly argues that an important purpose of this right is to facilitate switching to another service provider and to avoid so-called ‘user lock-in’ with the original controller. However, this does not mean that the purpose pursued by the claimants – analysis of their own personal data or use for their own purposes – is excluded from the right to data portability. There is no support for this in the legislative history of the AVG, the recitals to the AVG itself or the Guidelines. The claim of abuse of the right to data portability is therefore rejected.
Furthermore, the court said Ola should give drivers access to anonymised ratings on their performance, to personal data used to create their “fraud probability score” and to data used to create an earnings profile that influences work allocation.
Here we come to the most interesting part of the judgment, where the Court grappled with the question of access to information related to automated decision-making and profiling. The claimants requested access to the existence of automated decision-making and profiling on the basis of Article 15(1) of the AVG. This article stipulates that the data subject has the right to obtain from the controller information about the existence of automated decision-making, including profiling, and, at least in such cases, useful information about the underlying logic as well as the importance and expected consequences of such processing for the data subject.
Under Article 12(1) of the AVG, the data controller must provide data subjects with concise, transparent, comprehensible and easily accessible information about the processing of their personal data. The AVG defines profiling as:
any form of automated processing of personal data which evaluates, on the basis of personal data, certain personal aspects relating to an individual, in particular with a view to analysing or predicting the individual’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
A data subject must be informed of the existence of profiling and its consequences (recital 60 of the AVG).
Under Article 22 of the AVG, individuals have the right, subject to certain exceptions, not to be subject to a decision based solely on automated processing or profiling which produces legal effects concerning them or significantly affects them in some other way. A decision based solely on automated processing is one where there is no significant human intervention in the decision-making process. Recital 71 of the AVG mentions as examples of automated decision making the automatic refusal of a credit application submitted online or the processing of job applications via the internet without human intervention.
In the case of these claimants, the Court observed that there had certainly been profiling within the meaning of Article 4(4) of the AVG, because the professional performance of the driver was being evaluated. This means that Ola had to allow access to the personal data of the claimants which it used to draw up the profile and should also provide information about the segments into which the claimants were classified, so that they could check whether this information was correct.
The Court noted that Ola applies an automated decision-making process which determines that journeys are invalid, as a result of which penalties and deductions are imposed. It followed from Ola’s explanation of this decision-making process that there was no human intervention prior to such a decision. The [automated] decision to impose a reduction or fine affected the claimants’ rights under the agreement with Ola. This means that Ola was prohibited from subjecting the claimants to such decision-making unless it was necessary for the performance of the agreement between Ola and the claimants.
In conclusion, the Court ordered Ola, within two months after service of the order, to provide the claimants with a copy of, or inspection of, the (personal) data concerned.
Comment
Apart from being another chip in the wall erected by the “ride hailing” gig taxi industry, this judgment is a bellwether for the future approach of courts to “black box” automated decision making in processes relying on AI. Jacob Turner of Fountain Court Chambers tweeted (@Jacobturner1):
These judgments [including the one I’ve summarised above] are the first in the world on the right to an explanation of automated decision-making under the GDPR. Uber must disclose data about alleged fraudulent activities by the drivers, based on which Uber deactivated their accounts (‘robo-firing’), as well as data about individual ratings. Ola Cabs must provide access to ‘fraud probability scores’, earning profiles, and data that was used in a surveillance system. In the case of one Ola driver, the court decided that a decision to make deductions from driver earnings using an algorithm amounted to an automated decision lacking human intervention.
The Financial Times [paywall] reported the Uber ruling by the same court. According to the FT, Uber responded that this was a “crucial decision”:
“The court has confirmed Uber’s dispatch system does not equate to automated decision-making, and that we provided drivers with the data they are entitled to. The court also confirmed that Uber’s processes have meaningful human involvement.”
However, not all of the ride-hailing apps escaped the “automated decision making” category in these judgments, and James Farrar, a representative of one of the drivers’ unions involved, said: “This is a hugely important first step … We’re going to have to do an awful lot more.”
Winning access to data was crucial, he said, because as platforms’ contractual arrangements with workers came under greater scrutiny, they were shifting towards more opaque automated management systems. Greater transparency would not only help drivers contest unfair decisions against them but would also help to ascertain their average hourly earnings after costs.