Global concerns about children’s data privacy have led to a number of national and international regulatory instruments. The UN Convention on the Rights of the Child (UNCRC) and its General Comment 25, the EU General Data Protection Regulation (GDPR), the European Convention on Human Rights (ECHR), the US Children’s Online Privacy Protection Act 1998 (COPPA), and the proposed PROTECT Kids Act in the US all mark significant progress in this area and signal its growing importance. They acknowledge that children ‘may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data’[i] and affirm that states have an obligation to protect children’s privacy in digital contexts.
While my talk introduced a subject not yet on the policy radar in New Zealand, responses from conference participants confirmed that children’s data privacy is clearly compromised in the current educational setting. Conversations with teachers and non-profit organisations revealed that they had not yet considered children’s privacy in terms of the ‘data’ generated by educational software and applications throughout the school experience; privacy awareness is often limited to issues of student safety and conduct online. Discussions with IT managers revealed that no school policies specifically address the collection of children’s data by EdTech companies.
EdTech: A Covid-19 Growth Industry
We know that youth represent a sizeable consumer market, as they are early adopters of digital technology and online services such as social media, film, music, and games. But commercial interest in children’s educational experiences is also a thriving market, currently experiencing a CAGR[ii] of around 20%, and the global market is expected to reach USD 377.85 billion by 2028.
The collection and use of student data in education settings is framed as a digital learning revolution: it promises to democratise education by making it more accessible, personalised, immersive, automated, and based on data-driven insights. Digital technologies promise the 21st-century competencies required to participate successfully in a digital society (Reagan and Bailey, p. 9). Consequently, there has been a tremendous shift in the education sector from conventional exams and in-class learning to a personalised, interactive learning approach built on the digitisation of the student experience. Covid-19 has further amplified the integration of data technologies in education, increasing privacy risks for children: venture and equity financing for education technology start-ups more than trebled during the pandemic, to $12.58 billion worldwide in 2020.
Without doubt, the push to adopt EdTech during the pandemic has significantly increased the collection of children’s data globally, along with the remote surveillance of students.
The Datafication of Student Experience Increases Threats to Data Privacy
Customarily, schools are perceived to have a duty of care for ensuring children’s safety and wellbeing while at school. In interviews with parents and teenagers about data privacy in 2019, it was clear that they expect schools to collect information pertaining to enrolment and safety, including details like name, gender, age, address, parent name, contact phone number and address, and email, as well as health, disability, or behavioural information needed to ensure their wellbeing while at school. Parents and students consent to this information being collected so that the school can carry out its ‘duty of care’ (Keen, 2019).
However, the student data now collected and generated through schools is far more extensive and detailed, as schools digitise administration, teaching, assessment, and other non-educational services.[iii] The internet, digital technologies, and big data analytics now underpin the student experience, largely without students, parents, and often schools knowing or understanding the risks. Parents and students are largely unaware of the data being captured or generated about them in the school environment. Further, they often believe that regulations exist to prevent the collection and misuse of personal data when interacting with companies and institutions online (Keen, 2020).
The Privacy Implications of EdTech
Administrative data is required for the school to carry out its duty of care as educators and caregivers during school hours and on school premises. Students need to enrol, choose classes, and access guidance counselling, libraries, and health services; they may have special educational needs arising from learning or behavioural issues that require additional attention; and their attendance is recorded. For the most part, parents and children knowingly and willingly provide this sort of information to schools. However, schools now outsource student administration and management systems to a variety of global and local commercial providers (e.g. G Suite for Education, Microsoft for Schools, KAMAR, PCSchool, MUSAC, Seesaw, or ClassDojo). Schools can find themselves sharing student data when outsourcing technical support and data security, adding further privacy risks, and many providers share data with their affiliates and third parties. Use of these systems often requires users to consent to data being collected about them, but parents and students generally have no knowledge of what data is collected or who has access to it.
EdTech providers now collect more data and use data analytics to provide personalised instruction, creating individualised learner profiles to do so. Such software is increasingly adopted by teachers to help direct student work and to adjust automatically to student needs by profiling student capabilities. Problematically, learner profiles are built on proprietary algorithms, and schools may have little control over, or knowledge of, how these assessments are constructed, as the sketch below illustrates.
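To make the profiling concern concrete, here is a minimal sketch, in Python, of how an adaptive-learning product might maintain a learner profile. Everything here is hypothetical: the field names, the scoring rule, and the ‘intervention’ threshold are invented for illustration, since real vendors’ models are proprietary and unpublished — which is exactly the problem for schools trying to scrutinise them.

```python
# Hypothetical sketch of a learner profile in an adaptive-learning system.
# The update rule and threshold are invented for illustration; real EdTech
# products use proprietary, unpublished models.
from dataclasses import dataclass, field


@dataclass
class LearnerProfile:
    student_id: str
    # Estimated mastery per skill (0.0-1.0), nudged after every answer.
    mastery: dict = field(default_factory=dict)

    def record_answer(self, skill: str, correct: bool) -> None:
        """Update the mastery estimate with a crude exponential moving
        average, standing in for a vendor's scoring model."""
        prior = self.mastery.get(skill, 0.5)
        self.mastery[skill] = 0.8 * prior + 0.2 * (1.0 if correct else 0.0)

    def flag_for_intervention(self, threshold: float = 0.3) -> list:
        """Skills the system would label the student 'weak' in: an opaque
        cut-off that can follow a child through their schooling."""
        return [s for s, m in self.mastery.items() if m < threshold]


profile = LearnerProfile(student_id="pupil-0042")
for correct in (False, False, False, False):
    profile.record_answer("fractions", correct)

print(profile.mastery)                  # roughly {'fractions': 0.20}
print(profile.flag_for_intervention())  # ['fractions']
```

The point is not the arithmetic but the opacity: a school deploying the real product sees only the output (‘flag this child’), not the rule that produced it.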
It is now regular practice for students to complete homework assignments, work collaboratively, and engage with teachers and classmates using digital technologies. Global digital services like Google, Apple, Microsoft, Adobe, Zoom, YouTube, Facebook,[iv] and a host of other apps and services provide the infrastructure supporting school administration, teaching and learning, communications, work, instruction, and assessment. Although we have embraced such services, these global tech companies have faced a string of legal challenges over their handling of children’s data.[v] For instance, in New Mexico Google was accused of collecting excessive data on children, tracking school children under the age of 13 ‘across the internet, across devices, in their homes, and well outside the education sphere, all without obtaining verifiable parental consent’ (as required under COPPA in the US). The extent of the information collected may surprise many readers: Google was charged with collecting students’ physical locations, website browsing histories, content viewed on YouTube (owned by Google), all Google searches and selected results, personal contact lists and passwords, and biometric data including voice recordings. Despite such privacy concerns, Google prevailed by arguing that, as an online service provider, it was permitted to use schools as intermediaries for parental notification and consent, and that it had secured permission to gather student data, thereby absolving itself of accountability. This highlights the ambiguity of consent and raises questions about how informed consent is obtained, from whom it ought to be obtained, and whether individuals can opt out of data collection if they wish. Additionally, international and local privacy instruments urge limits on the secondary use of data and its distribution to third parties. Running counter to this, ‘Google’s terms and conditions are increasingly being incorporated into the terms of separate services, including those offered by third parties’.[vi]
A plethora of free apps poses heightened risks of third parties accessing children’s data
In conversations with teachers during 2020, it became apparent that they regularly use free apps and web-based learning tools to meet individual student needs. They spend considerable time learning new applications without professional training, and without any Privacy Impact Assessments being conducted. As one teacher put it:
‘Along with Zoom there's always things coming up, you know our maths programs, Education Perfect, Maths Buddy. There's lots and lots of platforms coming out all the time that we use that we've got to learn really quickly, and we do that in our own time. And there's not necessarily PD around that. We've got to take the ownership of really going in and exploring that for ourselves.’
However, free apps usually offer lower levels of privacy protection and often specifically set out to monetise user data, selling it on to third parties. Increasingly, EdTech is also marketed to parents through the Apple and Google app stores, exploiting parents’ anxieties about their children’s educational progress.
Apps commonly collect personal data, and children’s data is no exception. Take, for instance, the recent media report that found 60 per cent of mobile applications commonly used in US schools were transmitting student data, largely without users’ knowledge. Previous studies have highlighted that many apps collect data from children, sometimes unknowingly,[vii] and indicate that even developers may not be aware of what data is collected and how it is handled, as they bundle SDKs (software development kits) that frequently collect data for secondary purposes and share it with third parties.[viii] Even more sobering, an app census completed for the Australian Competition and Consumer Commission (ACCC) revealed that popular apps such as Google, Seesaw, ClassDojo, Minecraft, Educational Puzzles, and Mathway were among the top apps requesting access to sensitive user information via so-called ‘dangerous permissions’.[ix] A simplified sketch of this kind of permissions audit follows.
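For readers who want a feel for what the ACCC’s app census measured, the following is a minimal sketch assuming a hypothetical inventory of school apps. The permission strings are genuine Android ‘dangerous’ permissions, but the app names and their declared permissions are invented; a real audit like AppCensus’s would also analyse network traffic to see where the data actually goes.

```python
# Minimal sketch of a permissions audit in the spirit of the ACCC's app
# census. The app inventory below is hypothetical; the permission strings
# are a hand-picked subset of Android's "dangerous" permission group,
# which gates access to sensitive user information.
DANGEROUS = {
    "android.permission.ACCESS_FINE_LOCATION",  # precise location
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_EXTERNAL_STORAGE",
}

# Hypothetical inventory: app name -> permissions declared in its manifest.
school_apps = {
    "maths-practice": {"android.permission.INTERNET",
                       "android.permission.ACCESS_FINE_LOCATION"},
    "class-messenger": {"android.permission.INTERNET",
                        "android.permission.READ_CONTACTS",
                        "android.permission.RECORD_AUDIO"},
    "spelling-game": {"android.permission.INTERNET"},
}

for app, perms in sorted(school_apps.items()):
    risky = sorted(perms & DANGEROUS)
    if risky:
        print(f"{app}: requests dangerous permissions {risky}")
# class-messenger: [...READ_CONTACTS, ...RECORD_AUDIO]
# maths-practice: [...ACCESS_FINE_LOCATION]
```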
But wait, there’s more… the collection of children’s biometric data has been increasingly normalised during the Covid-19 pandemic
Research in 2019 found that parents and teenagers typically prioritised the security of their biometric data and did not want companies to collect face, fingerprint, or voice recognition data (Keen, 2020). We should therefore be concerned about the use of biometric data in education. For instance, in Australia it is now a regular occurrence for schools and universities to use facial recognition for recording attendance at classes[x] and for emotional surveillance during class.[xi]

[Image: facial recognition software sorting students’ expressions into seven emotion categories. Picture sourced from CNN Business.]
But even more concerning is the recent development of facial recognition to monitor students’ emotions while they are doing their work at home.[xii] These technologies take control of devices in the home, deploy cameras, and collect behavioural data that is thought to help teachers identify a student’s learning capacity. Aside from the obvious surveillance of individual students in their homes, the scientific rigour of reducing human emotion to seven categories (see image above) is questionable, as is the claim that doing so improves teaching. Sometimes we simply need to question the logic behind adopting new technologies, asking whether they add anything to existing teaching and learning outcomes, and especially whether they may unfairly discriminate against some students.
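A minimal sketch helps show how reductive these systems are. The seven labels below follow the commonly used ‘basic emotions’ taxonomy; the scores are invented, and no real vendor’s model is represented here.

```python
# Hypothetical sketch of what a seven-category emotion classifier reduces
# a child's face to. The labels follow the common "basic emotions"
# taxonomy; the scores are invented for illustration.
EMOTIONS = ["happy", "sad", "angry", "fearful",
            "surprised", "disgusted", "neutral"]

def classify(scores: list) -> str:
    """Return the single highest-scoring label. Whatever nuance the face
    held (boredom, confusion, concentration) is discarded: every
    expression must become one of the seven buckets."""
    assert len(scores) == len(EMOTIONS)
    return EMOTIONS[scores.index(max(scores))]

# A near-uniform score vector: the model is effectively guessing, yet a
# teacher's dashboard would still display one confident-looking label.
print(classify([0.16, 0.15, 0.14, 0.13, 0.14, 0.13, 0.15]))  # -> happy
```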
What this means for New Zealand Educators and Policy Makers
New Zealand educators and policy makers have yet to consider the nature and magnitude of the data now amassed within an increasingly privatised school setting, in which external businesses implement many aspects of school administration, teaching and learning, and assessment. Despite the growth of educational apps and the distribution and use of student data across schools, government, and commercial organisations, parents and students are rarely made aware of the data being collected and its potential consequences, and their consent is rarely obtained.
Privacy laws in New Zealand[xiii] and Australia[xiv] continue to take a broad ‘principled’ approach that addresses some issues around informing individuals when collecting data, rights of access and correction, and minimising use, but these principles are vague and provide no specific protections for children. They do not yet reflect the integrity and specificity of legal instruments developed across the EU and US, or of global guidance such as the UNCRC and General Comment 25, which bolster children’s rights to privacy and data protection in the digital environment.
The New Zealand Privacy Act 2020 does not clearly address consent, profiling, secondary uses by third parties, excessive data collection, rights of erasure, or the collection of children’s biometric data. We need to consider these risks and provide better legislative and regulatory guidelines to keep up with international regulatory trends.
At the very least, schools will need to conduct Privacy Impact Assessments that address the key principles outlined in the new Privacy Act 2020 in relation to the data they collect, use, and store about children and families, but this is not enough. We are now obligated to consider international trends and legislative developments that specifically address children’s rights to data privacy. Reviewing the data ecologies that flow between schools, communities, parents, and students will allow us to fairly address children’s data rights; ensure that data collected remains secure and confidential within school settings; and ensure that data has a valid educational purpose and does not unfairly discriminate against some students. We need greater transparency about the data being collected, as well as alternatives for students and parents who do not wish to use apps and services that breach their privacy expectations.

Inquiries to:
Dr Caroline Keen
CEO, Keen Initiatives Limited
Mobile: 027 275 8585
Website: www.socialresearchnz.com
Additional resources:
[i] GDPR, Recital 38.
[ii] Compound annual growth rate.
[iii] Polonetsky, J. and Jerome, J. (2014). Student Data: Trust, Transparency and the Role of Consent. Future of Privacy Forum.
[iv] Other successful prosecutions brought through the FTC in the US include: Google and YouTube fined $170 million for alleged violations of children’s privacy law, 4 September 2019.
[v] https://www.theverge.com/2019/12/5/20997199/tiktok-bytedance-musically-lawsuit-coppa-settlement-children-data
[vi] Digital Platform Services Inquiry: Interim Report (2020). Canberra, Australian Capital Territory. Retrieved from https://www.accc.gov.au/system/files/ACCC%20Digital%20Platforms%20Service%20Inquiry%20-%20September%202020%20interim%20report.pdf
[vii] Reyes, I., Wijesekera, P., Reardon, J., Elazari Bar On, A., Razaghpanah, A., Vallina-Rodriguez, N. and Egelman, S. (2018). ‘Won’t Somebody Think of the Children?’ Examining COPPA Compliance at Scale. Proceedings on Privacy Enhancing Technologies, 3, 63–83.
[viii] AppCensus (2020). 1,000 Mobile Apps in Australia: A Report for the ACCC, 24 September 2020, p. 21.
[ix] AppCensus (2020). 1,000 Mobile Apps in Australia: A Report for the ACCC, 24 September 2020, pp. 38–43; see figures 15 and 16.
[x] Andrejevic, M. and Selwyn, N. (2020). Facial recognition technology in schools: critical questions and concerns. Learning, Media and Technology, 45(2), 115–128. doi:10.1080/17439884.2020.1686014
[xi] https://www.ellucian.com/emea-ap/insights/facial-recognition-can-give-students-better-service-and-security
[xii] https://edition.cnn.com/2021/02/16/tech/emotion-recognition-ai-education-spc-intl-hnk/index.html
[xiii] https://www.legislation.govt.nz/act/public/2020/0031/latest/LMS23223.html
[xiv] Archbold et al. (2021).