Critical notes on ‘platformised’ education: untangling privacy and data protection in post-pandemic universities
The widespread use of online technologies across European Higher Education Institutions (HEIs) is shaking the foundations of our education system.
The COVID-19 pandemic has triggered disruptive transformations in education practices, destined to outlive emergency times as consolidated digital teaching and learning. Exploiting new online means and interacting with new stakeholders (e.g. online platforms) has now become a common feature in education. This new reality carries opportunities, but also significant risks for the protection of fundamental rights and freedoms, structurally embedded in education.
The data protection implications resulting from the ‘platformisation’ of education are attracting the attention of all involved actors (HEIs, teachers, students, platforms) and, lately, of policymakers. Yet the way personal data collected through these educational practices are governed, processed and shared remains a largely unexplored issue. It is against this background that we recently conducted a study aiming to shed light on this new problematique. We identified the main data protection gaps for data subjects (students and teachers) and the critical challenges universities should consider when relying on third-party service providers, namely: 1) the allocation of roles and responsibilities of the actors involved; 2) the definition of the legal bases and purposes of the processing, its transparency and the possibility to effectively exercise data subjects’ rights; 3) extra-EU data transfers after Schrems II; and 4) e-proctoring systems. In this blog post we briefly summarise our findings and conclude by proposing policy recommendations to overcome the identified critical points.
We essentially argue that the implementation of the right to privacy and data protection in the Emergency Remote Teaching (ERT) environment is not merely an issue of compliance, but a substantial measure that universities shall ensure. Data protection rules are conceived as a core facilitator for the building of the European digital education ecosystem. Ultimately, our study assumes that tomorrow’s high-quality, open, and inclusive education will be based on today’s careful analysis and responsible data protection choices in the emerging digital education environment.
Data protection roles: Untangling the powers and responsibilities of actors involved in remote teaching
When a university relies on external platforms for building its ERT infrastructure, the allocation of data protection roles among different actors becomes crucial, as it has significant consequences for the attribution of responsibilities and duties.
Teachers, universities, and platforms may perform diverse data protection roles, in light of the CJEU’s case-law (e.g., Fashion ID and Holstein) and of the recent EDPB Guidelines 7/2020. Generally speaking, and according to article 4(7) and (8) GDPR, which lay down the definitions of data controller and data processor, universities qualify as controllers when they determine the means and the purposes of the data processing; in those cases, platforms are appointed as processors (see also our previous study). In practice, however, platforms may play the most decisive role in the definition of the data processing agreement. This means that while the university is formally the controller, the platform exercises substantial power in planning and designing the data processing. Moreover, online platforms often further process the data collected within an ERT context for their own purposes, determining the means of that processing themselves. They thereby become autonomous controllers for that processing, opening up questions about the role and responsibility of the university in this additional processing.
Furthermore, the rapid shift toward ERT might have burdened teachers with more (legal) responsibility: in some cases, teachers autonomously choose online educational tools. In doing so, they may determine the purposes and the means of processing, becoming controllers – often without being aware of it.
Ultimately, it becomes apparent that the allocation of data protection roles among the participating entities has a significant effect on the architecture of ERT, and therefore on the shape of our education system. As the decisive role on the determination of means and purposes of data collection and processing vacillates between teachers, platforms and universities, the allocation of shared responsibilities would have to be integrated and reflected in the design of digital education tools.
Setting the boundaries of processing: Determining the legal basis and purposes and enforcing data subjects’ rights
As set out in article 5 GDPR, personal data shall be processed lawfully and according to specific purposes identified by responsible actors, the data controllers. The determination of the legal basis and purposes for each data processing activity according to article 6 GDPR is key to establishing the lawfulness of personal data processing orchestrated by data controllers within the ERT context. These accountable actors, in turn, are responsible for ensuring overall respect for data subjects’ rights set out in chapter 3 GDPR, and for responding to data subjects’ requests to access, rectify, or erase personal data processed during ERT delivery. Thus, the identification of accountable actors and respect for these GDPR principles are all essential in setting boundaries to the personal data processing that occurs.
Concretely, as controllers, universities are the ones to determine the purposes and the legal basis for processing. When relying on external platforms, they hence need to assess not only the suitability of these tools for educational purposes, but also the data protection guarantees that the platforms offer. For instance, universities would have to carefully consider whether these platforms process students’ or teachers’ personal data for independent purposes, which might not be compatible with the original purposes of data collection. Similarly, transparency is crucial in determining how data subjects can exercise their rights vis-à-vis the platforms.
For this reason, we argue that associations representing groups of universities could support them in negotiations vis-à-vis platforms, and foster a substantial role for HEIs as controllers. The importance of teachers’ and students’ engagement in data protection issues related to ERT platforms cannot be overstated. Indeed, this collective data protection exercise would preserve transparency, accountability, and it would ultimately promote the educational and teaching values of each institution.
The Platformisation of Education: Life after Schrems II
The majority of platforms employed in ERT rely on cross-border data flows as a fundamental element of their digital service model, in particular for storage and maintenance purposes. Until recently, most of these platforms’ privacy policies referred to the Commission adequacy decision allowing EU-US data flows, i.e. the Privacy Shield, as the legal basis for such transfers.
After its invalidation by the CJEU in Schrems II, ERT providers must now find an alternative legal basis if they intend to transfer data to the US.
The transfer tools envisaged by the GDPR, such as Standard Contractual Clauses (SCCs), Binding Corporate Rules (BCRs), or the derogations under art. 49 GDPR, are difficult to implement in the context of remote teaching services. For instance, universities might face difficulties when assessing the supplementary measures needed to complement SCCs. BCRs, in turn, risk prioritising and standardising the reality of larger digital education service providers. Finally, the art. 49 GDPR derogations are vague, and could end up legitimising extra-EU transfers that fall short of the “essential equivalence” standard of protection.
For a correct implementation of the transfer tools, HEIs should hence renegotiate the terms of the processing agreement with exporting processors. Moreover, the international dimension of digital educational models, and the frequency of the resulting data flows, should garner greater attention from European regulators, so as to establish a clearer normative framework for the extra-EU circulation of citizens’ data collected during educational activities.
E-proctoring: Is this the best we can do?
E-proctoring systems are another spin-off of platformised education. E-proctoring aims to ensure the integrity and validity of online exams by monitoring students via the computer’s webcam and microphone. Considering their features, such tools are likely to be more problematic than in-class invigilation in terms of privacy, students’ health, and discrimination. First, the exam is not held on University premises but is brought into students’ homes, thus intruding into one of individuals’ most intimate spheres. Second, the inspection of the workstation might be frustrating, uncomfortable and humiliating for some students. Third, the non-stop monitoring during the exam is likely to have a negative impact on the performance of particularly anxious students (see here).
In our previous work we echoed such concerns and pointed out the risks brought by the lack of accuracy of automated decision-making systems, and the problem of algorithmic bias. For instance, e-proctoring systems are already showing difficulties in correctly recognising black students (see here and here).
Although a first decision on the software “Proctorio” ruled that e-proctoring can be implemented in compliance with the GDPR (see Rb. Amsterdam – C/13/684665 / KG ZA 20-481), we argue that less privacy-intrusive means of evaluating students could be implemented, at least in the medium and long term. For instance, structuring the exam so that pure notions and basic knowledge are not directly evaluated could offer a better alternative that renders (human and machine) invigilation unnecessary. There are plenty of assessment strategies that can be successfully implemented as a take-home exam (essays, open questions, opinions, reports, interviews) or in the form of continuous evaluation (projects, case studies, group presentations, group work, artifacts, etc.). In these kinds of assessments, the student does not have to demonstrate that she “knows”, but that she critically masters the knowledge and uses the skills acquired during the course – usually the most desirable learning outcome in all our syllabi. Such a type of exam involves a more complex exercise, not only for the students, but also for the evaluators: it requires more time than a set of multiple-choice questions, and it is hard to manage with a very large number of students.
In other terms, there are already less intrusive means than e-proctoring to organise an exam session, but their implementation will require systemic changes. First of all, it means continuing, or increasing, the effort to educate our students in the value of academic integrity. The purpose of the university is also to form responsible citizens and accountable future professionals. Second, the above-mentioned non-invigilated exams require setting a reasonable ratio between the number of teachers and students. To allow ourselves a “physical” metaphor, restructuring the learning experience around the size of the class, and not the auditorium, will help set the main condition for promoting dialogue and constructive exchange between all the participants.
The problem of class size, the possibility to have a truly formative experience also during the assessment, and the increasing workload for teachers, are all questions that have been with us for a while. The emergency situation we are experiencing is simply reminding us that those issues remain and that we should no longer postpone looking for an answer.
This work is part of broader research into the platformisation of education (see here, here, here, here, and here). We have focused on ERT’s data protection risks, and how these have been exacerbated by the pandemic in four key areas. We concluded with a first set of recommendations for ensuring a safer and fairer remote teaching environment, and presented strategies that go beyond the emergency and beyond mere compliance with the data protection framework.
First, when universities decide to rely on third-party services, adherence to the privacy and data protection framework by the platform is a necessary starting point. To this end, various actions could be taken at different levels.
National data protection authorities can play a crucial role here. For instance, they could: 1) scrutinise platform services tailored for education (for example, through direct investigations and global initiatives like the Global Privacy Enforcement Network “Sweep days”); 2) promote education and awareness campaigns about the legal implications of ERT for HEIs and the relevant stakeholders.
Furthermore, to remedy the informational and power asymmetries between universities and platforms, we argue that collective negotiations led by groups of universities may be a possible tool to enhance the bargaining power of HEIs and preserve their autonomy. Groups of universities could join forces when negotiating with platforms, proposing their own data protection conditions, shaped according to their cultural mission and educational needs. This collective negotiation may cover several data protection and privacy issues: the data to be collected, the data minimisation safeguards and features, the clear distribution of roles and responsibilities vis-à-vis the data subject, the location of the servers, the purposes of the processing, any admissible repurposing, and data sharing.
Second, and in parallel, we stress the importance of seriously discussing the possibility of bringing the digital infrastructure “in-house”. New public infrastructures, or the enhancement of existing ones, highly customisable at the local level, should be put in place in order to preserve privacy, data protection, and educational diversity.
In any case, in our opinion, two fundamental aspects cannot be overridden in the design of the so-called “post-pandemic university”. First, the formation of the values that will govern the establishment of HEIs’ digital infrastructures must take data protection into consideration and rely on its inherent balancing exercises; this balancing will undoubtedly have a considerable effect on shaping digital education as a whole. In addition, collective empowerment of data subjects (students, teachers, admin staff) within the University and participation in the decision-making processes should be factored in as an important institutional step to shape ERT legal architectures according to universities’ cultural mission and educational needs.