Caroline Keen

How much student data is being collected by EdTech in New Zealand schools?

The answer to this question remains hidden from view. It is not easy for parents or students to find out what information is collected from children throughout their education. In fact, parents and students may never know the full scope and nature of the data collected from children, or how it may be shared, used, misused, sold to commercial data aggregators, breached or hacked during their school careers.


And while there has been little motivation from Government or educators in New Zealand to look into students' data privacy, other countries have been actively examining how to safeguard children's personal data within schools. This is an issue that needs urgent attention in New Zealand.




What risks does EdTech pose for students?

EdTech companies can collect unknown quantities and types of personal data from child users during their learning, and use this data for commercial purposes.


A recent Human Rights Watch (HRW) report shows that most governments endorsed EdTech apps during the Covid-19 pandemic, putting children's privacy at risk.


The report states that around 80% of these EdTech apps installed tracking technologies on children's devices that follow them outside the virtual classroom and across the Internet over time. Many sent children's data to third-party advertising companies (AdTech), or granted those companies access to it, where behavioural and predictive analysis takes place and the resulting data is then sold on.
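To make "tracking technologies" less abstract, here is a minimal sketch in Python of how a parent or researcher might list the third-party script hosts a web page loads, which is one crude signal of embedded trackers. The URL is a placeholder, not a real EdTech service, and dedicated auditing tools go much further than this.

```python
# Minimal sketch: list third-party script hosts embedded in one web page.
# This is a crude tracker signal only; real audits also inspect cookies,
# pixels, and network traffic. The URL below is a placeholder.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

PAGE = "https://edtech-app.example/classroom"  # hypothetical page to audit

class ScriptHostCollector(HTMLParser):
    """Collects the hostnames that <script src=...> tags point at."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src", "")
            host = urlparse(src).netloc
            if host:  # relative (first-party) srcs have no host
                self.hosts.add(host)

html = urlopen(PAGE).read().decode("utf-8", errors="replace")
collector = ScriptHostCollector()
collector.feed(html)

first_party = urlparse(PAGE).netloc
for host in sorted(collector.hosts - {first_party}):
    print("third-party script host:", host)
```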


Another recent report, by the UK Digital Futures Commission (DFC), examined just two EdTech services: Google Classroom and ClassDojo.

Apart from concerns about transparency within contracts, the report notes that the privacy risk in Google Classroom arises when children leave the core service and follow links into Google's other commercial services, such as Google Search, Maps, YouTube and other applications. In one test, a child clicking on teacher-recommended resources on Vimeo and YouTube triggered cookie surveillance by 92 companies. The report also raises concerns about children's privacy in the ClassDojo application, as it drives behavioural profiling and social scoring and is ripe for misuse.


What are the consequences?

The consequences are longer term: children's behavioural and predictive data will be used by companies and institutions when today's children are adults, and may work against some of them. For instance, children's educational and developmental records may be accessible at the click of a button to prospective employers, universities, insurers, banks, government or law enforcement, allowing them to decide an individual's eligibility. This will result in unfair discrimination for some. More concerning still, individuals will not know what data is being used to make decisions about them, and will therefore have no way to contest those decisions. Basically, today's children will have less self-determination over their life choices and opportunities.


What data privacy rights do children have?

Internationally, there have been several regulatory and policy advances across the privacy spectrum that address children's right to privacy in the face of data collection by commercial or institutional actors. More national and international organizations are now researching and lobbying for a strengthening of children's digital rights, and within that their rights to privacy in relation to commercial interests.


These emerging privacy frameworks aim to strengthen regulatory safeguards for children's data by placing limits on data collection and providing governance and oversight, while also empowering parents and children to actively manage their data privacy through awareness and education campaigns. While there is more in the pipeline internationally, here are some of the regulatory approaches to children's data privacy overseas.


European Union General Data Protection Regulation (GDPR)

Children's informational rights have been enshrined in the EU General Data Protection Regulation (GDPR), which came into effect on 25 May 2018. Recital 38 acknowledges that children:


“may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data.”


And that


“specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child.”


It also introduced the right to erasure of children's data, and prohibited automated decision-making about children without human oversight in areas likely to have real-world consequences for a child.


Building on the self-management model, the GDPR introduces an age threshold, set by each member state at between 13 and 16 years, below which parental consent is required for the collection of children's information, while teenagers above the threshold make their own decisions about whether to ‘consent’ or ‘opt out’. These rules apply to EU member states, but they have stimulated other national and international initiatives strengthening children's digital rights.
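As a rough illustration of how such a threshold operates, the Python sketch below implements an Article 8 style age gate. The country-to-age mapping covers only a handful of example states and may be out of date; a real service would need the full, current list and a lawful way to verify parental consent.

```python
# Illustrative GDPR Article 8 age gate. The GDPR default is 16, but
# member states may lower the digital age of consent to as low as 13.
# This sample mapping is incomplete and may be out of date.
DIGITAL_CONSENT_AGE = {
    "DE": 16,  # Germany
    "NL": 16,  # Netherlands
    "FR": 15,  # France
    "AT": 14,  # Austria
    "UK": 13,  # United Kingdom (Data Protection Act 2018)
}
DEFAULT_AGE = 16  # applies where no national derogation is listed

def parental_consent_required(child_age: int, country: str) -> bool:
    """True if a parent or guardian must consent before data is processed."""
    return child_age < DIGITAL_CONSENT_AGE.get(country, DEFAULT_AGE)

print(parental_consent_required(14, "FR"))  # True: below France's threshold of 15
print(parental_consent_required(15, "FR"))  # False: may decide for themselves
```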


The United Nations: General Comment 25

On 4 February 2021, the United Nations Committee on the Rights of the Child (UNCRC) adopted General Comment 25, which outlines children's rights in relation to the digital environment.

Along with children's rights to protection, participation, access to information, and expression, the document strengthens global and member-state commitment to children's informational privacy in the digital environment. It addresses the use of artificial intelligence (AI), surveillance technology, emotional analytics, and profiling that pose serious risks to individual privacy, and on children's privacy in particular it states that:

  • States should prohibit by law the commercial profiling or targeting of children of any age (the Comment adopts the United Nations definition of childhood as up to the age of 18)

  • States should prohibit commercial neuromarketing, emotional analytics and immersive advertising directed at children

  • Any digital surveillance of children, and any automated processing of their personal data, should not be conducted routinely, indiscriminately, or without the child's knowledge or, for younger children, their parents' consent

  • Organisations should take the least privacy-intrusive route when dealing with children.

More specifically within education, General Comment 25 insists that:

  • Commercial products and services used in school settings should have ‘robust data protection’

  • Use of data in educational settings should be ‘transparent, accountable and subject to the consent of the child, parent or caregiver’

United Kingdom Data Protection Act 2018 and oversight through the Age Appropriate Design Code

In response to the GDPR, the UK passed the Data Protection Act 2018, incorporating these data regulations into law. It also established strong oversight, as well as education for the public and businesses. The Information Commissioner's Office (ICO) promotes data privacy and oversees compliance with the regulations generally, and specifically in relation to children. Education is central to a self-management model. In 2020 the ICO introduced the Age Appropriate Design Code (the Children's Code), promoting industry measures to strengthen children's data privacy and to empower parents and children with more control over personal data. The Code does this by:


  • adopting a ‘privacy by default’ approach when providing services to children (a minimal sketch of what this could look like follows this list)

  • minimizing data collection, recognising that not all data collection is good

  • turning profiling and geolocation tracking off by default

  • prohibiting nudge techniques

  • emphasizing the need to mitigate risks to children

  • calling for organizations to become ‘transparency champions’, giving parents and children more agency over data sharing.
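To give a sense of what ‘privacy by default’ could look like inside an EdTech product, here is a hypothetical Python sketch of a child account's starting settings under the Code's approach; the field names are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical defaults for a child's account under a privacy-by-default
# design: every privacy-affecting option starts in its most protective
# state, and weakening any of them requires a deliberate, explained choice.
@dataclass
class ChildAccountSettings:
    profiling_enabled: bool = False             # profiling off by default
    geolocation_tracking_enabled: bool = False  # geolocation off by default
    share_data_with_third_parties: bool = False
    nudge_prompts_enabled: bool = False         # no nudges toward weaker settings

settings = ChildAccountSettings()  # a new account starts fully locked down
assert not settings.profiling_enabled
assert not settings.geolocation_tracking_enabled
```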

California

Assembly Bill ('AB') 2273, the California Age-Appropriate Design Code Act, was passed on 11 August 2022 and will now be considered in the State Senate. Following the UK's lead, the amended bill introduces requirements for businesses that provide online services, products, or features likely to be accessed by a child. These include ensuring that all default privacy settings offer a high level of privacy protection, and stipulating that privacy information, terms of service, policies, and community standards be concise, prominent, and use clear language suited to the age of the children likely to access the service. The amended bill also requires a Data Protection Impact Assessment ('DPIA') before any new online service, product, or feature is offered to the public, with the documentation of that assessment maintained for as long as the service is likely to be accessed by children. Finally, violations will be subject to an injunction and a civil penalty of up to $2,500 per affected child for each negligent violation, or up to $7,500 per affected child for each intentional violation.


New Zealand

In New Zealand, however, there has been no effort to adopt or promote children's digital rights, and particularly their data privacy, at home or within schools.


The privacy campaign run by the Children's Privacy Working Group at the New Zealand Privacy Foundation has yielded little recognition or action from Government and Crown actors to review EdTech in New Zealand schools.


Additionally, there has been no formal acknowledgement in New Zealand of children's privacy rights as set out in General Comment 25, either by government or by Crown agencies that might bear some responsibility for addressing children's digital rights, and even less acknowledgement of the need to protect children's data privacy from commercial or institutional interests.


Existing legal, government and education frameworks in New Zealand do not currently treat children as deserving of any special attention in relation to their data privacy. In part this may be due to New Zealand policy-makers having an outdated understanding of what personal information is in the digital context, or a resignation that "the horse has bolted" and the job of reining in global and local tech companies to ensure regulatory safeguards for children is just too hard.


We need to acknowledge that schools have become commercial spaces and that this exposes children to commercial exploitation. The adoption of private software and data management services, and particularly EdTech services, allows commercial access to children's educational experiences and outcomes, as well as to their private lives beyond the school gates.


Start a conversation about children's data privacy by watching our animated video explainer

New Zealand parents know very little about their children's exposure to data collection through commercial services within schools, and are rarely informed or given the choice to consent or opt out. In part this is because we still think about student data in basic terms: the data parents and children knowingly give schools, and the data the school itself generates about a student, such as truancy, counselling, health services, academic grades and sports achievements. We do not yet think about the metadata and data analytics used by the EdTech software now embedded within schools.


We need a better understanding of what personal data is in the digital environment. To this end, Sociodigital Research has put together a four-minute animated video explaining data and data collection, which you can watch here.




What can we do to protect our children's data privacy?

Ideally, parents and children should not be put in a position of having to forfeit valuable online learning tools because government and EdTech companies fail to ensure their privacy is protected. However, schools are largely unaware of the privacy risks that EdTech poses to students.


We need to develop and enforce child-specific data protection law and ensure compliance. Even if schools or government audit EdTech against the New Zealand Privacy Act, this will not address children's rights specifically, or bring us closer to international regulatory frameworks.


We need to encourage data minimization, ensuring that analytics are used for educational purposes only, and that developmental and learning data are not sold or used for commercial purposes outside of a child's education. We also need a discussion about transparency and consent, along with raising awareness and empowering parents, teachers, and students to manage data privacy in an informed way.
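As a concrete, hypothetical illustration of data minimization, the sketch below lets analytics code see only the fields a stated educational purpose requires and drops everything else; all field names here are invented.

```python
# Hypothetical data-minimization filter: analytics only ever receive
# the fields required for the stated educational purpose.
ALLOWED_FIELDS_BY_PURPOSE = {
    "reading_progress": {"student_id", "reading_level", "assessment_date"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Strip a student record down to the fields the purpose requires."""
    allowed = ALLOWED_FIELDS_BY_PURPOSE.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

raw_record = {
    "student_id": "S123",
    "reading_level": 4,
    "assessment_date": "2022-08-01",
    "home_address": "12 Example St",  # irrelevant to reading analytics
    "browsing_history": ["..."],      # irrelevant to reading analytics
}
print(minimize(raw_record, "reading_progress"))
# {'student_id': 'S123', 'reading_level': 4, 'assessment_date': '2022-08-01'}
```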


For further information contact: Dr Caroline Keen, hello@sociodigitalresearch.net

