The increasing digitalization and datafication of education raise
pressing concerns linked to privacy and data protection. While
AI, data-based analytics and other
ICTs hold huge potential for the education sector and the creation of flexible lifelong learning pathways, they are also contributing to a rapid expansion of personal data processing.
Digital technologies are continually evolving, as illustrated by the recent development of
generative AI, opening up not only new opportunities but also new risks. The
Beijing Consensus on Artificial Intelligence and Education (2019) describes the potential for data to transform evidence-based policy planning processes such as
Educational Management Information Systems and to support adaptive learning processes, evaluation and assessment, but it also calls on States to ensure ‘ethical, transparent and auditable use of education data and algorithms’.

Internet technologies are capturing vast quantities of information, and the types of data being collected are expanding from administrative data (name, gender, attendance, scores) to data about learning processes (learning trajectory, engagement score, response times, pages read, videos viewed) and sometimes even biometric information, such as fingerprints or eye scans used to access and sit examinations (UNESCO, 2020c). Sensitive personal data, such as information on medical conditions, home situation, disciplinary measures or even immigration status, increase the need for strong mechanisms of protection. Yet children’s education data are far less protected than health data, which tend to be governed by a complex and comprehensive framework of regulations in many countries (Han, 2020).

Beyond security breaches, the threats inherent in the widespread collection of learners’ data are wide-ranging: the profiling of learners; the use of data for non-educational and commercial purposes such as targeted advertising; a lack of transparency of algorithms, with the bias and misinterpretations that can result; a lack of accountability; and privacy intrusions where cameras and microphones are used in the private spaces of teachers and learners are just a few of the risks involved (UNESCO, 2022d). Furthermore, true consent is difficult to obtain in online education, as existing power imbalances between EdTech, governments, children and parents are amplified (Cannataci, 2021). ‘Notice-and-choice’ regimes to garner consent are flawed: privacy policies are notoriously complex, and if a service has been chosen by the educational authority, little choice is left to the learner to opt out.
Unfortunately, schools tend to select applications and tools based on curriculum and financial considerations rather than privacy (Cannataci, 2021). Aside from data protection and privacy concerns, other risks are attributed to learners spending more time online. Children and youth engaging in online activities can be exposed to age-inappropriate or illegal content and inappropriate contact, and face privacy risks from the publication of sensitive personal information (ITU, 2020). There are also concerns for children’s mental and physical health and well-being associated with heightened levels of screen time.
The UNESCO Institute for Information Technologies in Education (IITE) has developed a Guidance Handbook for students, teachers and parents, based on the ‘Personal Data Security Technical Guide for Online Education Platforms’ launched by UNESCO IITE and Tsinghua University in 2020. A
Recommendation on the Ethics of Artificial Intelligence has been adopted by UNESCO as a comprehensive global standard-setting instrument to provide AI with a strong ethical basis, though this instrument concerns AI exclusively and is not specific to education. In relation to education, it notably encourages the promotion of
AI literacy education as well as prerequisite
skills for AI education (including basic literacy, numeracy, coding and digital skills) alongside general awareness programmes that cover the impact of
AI systems on human rights and their implications. The Recommendation furthermore highlights the need for the
ethical use of AI technologies in teaching, teacher training and e-learning, and the need to ensure inclusive participation. With regard to the protection of children and young people online, industry-led projects may also provide inspiration. The Tech Coalition is an alliance of global tech companies working together to combat child sexual exploitation and abuse online. It provides resources, education and capacity-building to tech companies, and serves as a resource for external stakeholders. Coalitions of private companies that opt in might not provide the most exhaustive protection for learners’ rights, but they are efficient, flexible, self-regulating and not limited by borders or government priorities.
Expansion of the legal framework.
Any new regulations must be capable of accommodating shifting and evolving technologies and
provide adequate legal protection around consent, data processing, data security and transparency.
Ideally, data protection authorities should be established for enforcement.
Throughout the consultative process, data protection and privacy were raised within the context of
digital learning. This subject was considered absolutely key to future iterations of the right to
education, especially questions of how data are collected and by whom, and the child’s right to be forgotten.
Participants stressed that teachers should also have rights to data protection and privacy. No
consensus was reached as to the way forward for regulating this sphere, though there was consensus
that marketing and collecting data for commercial purposes should be prohibited in education.
Article 12 of the UDHR, article 17 of the ICCPR, article 16 of the CRC and multiple other international and regional human rights instruments recognize privacy as a fundamental human right; however, there is
no explicit legal protection for personal data. Currently, data protection principles are instead the
focus of regional or multilateral bodies and organizations such as the Global Privacy Assembly (GPA)
the European Union, the Council of Europe, the OECD and, more recently, the African Union and
the Asia-Pacific Economic Cooperation (APEC). At the national level, 137 States out of 194 have
legislation on the topic of data protection and privacy (UNCTAD, n.d.).
Some feel that the protection of online data should be recognized legally as a part of the existing
right to privacy in international human rights law. The Special Rapporteur on the right to privacy has
frequently taken on the topic of online data protection as part of their remit, showing a conflation of
the two subjects, in recognition of the principle that rights people enjoy offline should also be
protected online (Cannataci, 2021; Nougrères, 2022).
Others are advocating for the enshrinement of a new distinct, fundamental right to data protection.
This move would allow for the development of distinct core principles as they relate to data
protection. It would provide weight to the argument that governments should be putting in place
more protective legal and political frameworks for learners in the digital space, as recommended by
the 2012 OECD Recommendation on the Protection of Children Online.
Some of the core principles that might form the basis of the right have been developed through
international dialogue, such as a duty to: obtain personal information fairly and lawfully; limit the
scope of data use to its original purpose; ensure that processing is adequate, relevant and not
excessive; ensure its accuracy; delete it when no longer required; and grant individuals the right to
access their information and request corrections (Scheinin, 2009).
Specific rights for the purpose of child online protection should also be envisaged. Promising legal
developments are happening in some parts of the world. In the UK, the Age-Appropriate Design Code
(2020) contains 15 flexible standards to build in protection for children online: for example, settings
must be ‘high privacy’ by default, children’s data should not be shared, and geolocation settings
turned off by default. The code is binding on all online services ‘likely to be accessed by children’ and
is enforced by the Information Commissioner. State legislators in California are using the UK
statutory code as the template for a bill they hope to pass to protect children online (Lima, 2022).