Caroline Keen

Apathy, convenience or irrelevance? Identifying conceptual barriers to safeguarding children’s data privacy.

Updated: Nov 14, 2021

Caroline Keen First published September 24, 2020. Research article. https://doi.org/10.1177/1461444820960068


Corporate surveillance by global tech firms has created a world in which we might well question our ability to keep personal and sensitive information private. But how much do we know or care about it?

While some countries are flexing newly acquired regulatory muscle, fining large global tech companies for collecting and monetising personal data, many users remain unaware of the longer-term consequences for children.

Global tech firms continue to disregard our privacy, finding new ways of monitoring the microdetails of our lives: our actions, behaviours, choices, thoughts, and physical and emotional wellbeing, in order to predict and control our behaviour, attitudes, consumption and actions. Despite this, most consumers continue to use online services, knowingly and unknowingly trading all manner of personal data in exchange for convenience and targeted advertising.

Regulators in Western democratic countries have begun overhauling regulation to offer greater guidance to businesses collecting your data (see the GDPR, California law changes, and the New Zealand Privacy Act 2020), while also trying to offer greater regulatory protection to consumers, and in particular to children. However, while regulators in some countries set down privacy protections for children (New Zealand’s law makes no particular reference to protecting children’s privacy against corporate interests), the onus is on parents of younger children to provide consent for corporate data collection, and on teenagers themselves (in the US, UK and Europe, aged 16 or as young as 13) to manage their own data privacy. While this neoliberal approach holds up consumer choice as a model for addressing a range of social problems (stranger danger, grooming, sexploitation, exposure to pornographic and violent content, etc.), we need to ask whether protecting children’s data can simply be added to the list of responsibilities assigned to individual internet users, parents and children included.

It is generally observed that children have scant knowledge of the commercial business models that mine personal details about them. Within the current policy environment, many are predisposed to the idea that education will fill the void, urging individuals, parents and children to address data privacy risks themselves. But will increasing awareness among parents and teenagers of corporate surveillance, data profiling, and their potential consequences for life opportunities necessarily result in better protections for children’s data privacy? Will parents and teenagers then be motivated to manage privacy risks alongside the many other risks (media, content, communications) of being online?

In 2019 InternetNZ funded my research exploring this issue. This research sought to understand how parents and teenagers understand privacy in the digital context and in relation to corporate surveillance. Findings suggest that current neoliberal regulatory frameworks that obligate parents and teenagers to manage data privacy fall short of what is needed to protect children's data privacy. The findings outlined in my publication "Apathy, convenience or irrelevance? Identifying conceptual barriers to safeguarding children’s data privacy", recently published in New Media & Society, support the argument for stronger regulation and oversight of the data mining practices of global tech firms. Interestingly, since the introduction of the GDPR, several global events have led to increased lobbying for stronger regulation of the large global tech firms across a range of issues: disinformation and its impact on democratic processes, the spread of conspiracy theories, the perils and misuse of AI and data profiling, and the circulation of violent and terrorist content, to name just a few. However, protecting children online still seems to rely on the model of user choice and parental mediation.

The research record contains conflicting reports about teen privacy practices, with some claiming that younger generations are oversharing on social networks, and others claiming that teenagers actively control who accesses the personal information they choose to share online, leveraging new technology platforms to navigate the social norms of youth today. However, because researchers have focused largely on the privacy practices of users, and particularly on teenagers' use of social media, it is teenagers' information-sharing practices that have come under scrutiny, rather than the data mining practices of global tech companies. The key contribution of this research is that it offers an explanation as to why parents and teenagers alike do not care about the corporate collection of information about them. It does this by focusing on how we conceptualise privacy harm, and particularly how we conceptualise it in the digital environment.


This research takes a step back to think about how privacy has been conceptualised in Western democratic countries and provides a framework for thinking about such concepts within the context of digital data collection. It explores how parents and teenagers understand their privacy in relation to others, institutions and corporations. By exploring how privacy and privacy harms are constructed by parents and teenagers, the research situates where they are most and least likely to actively manage data privacy online.

The research found that parents and teenagers are focused on immediate threats to their material interests, as well as to their reputation, status and safety, that could be caused by the unwanted sharing of their personal information. Such threats are highly visible in the media and within local communities, where parents and children are grappling with issues of online grooming, sexual exploitation, exposure to adult content, and cyberbullying. The unwanted sharing of information with 'others' and 'publics' can have immediate and serious consequences for some children and families. Personal information in the wrong hands risks its misuse by 'others' online.

However, the concept of corporate malevolence with respect to personal information remained largely absent. No parents in this study evaluated the data policies of the online platforms and websites they used, and they certainly did not consider that they ought to be doing this on behalf of their children. While some of this could be put down to poor public awareness of corporate data practices, data profiling, and its use by private and institutional actors, the prospect of corporations knowing one's intimate details remained largely benign, as parents and teenagers alike positioned privacy harms within the public/private dimension embedded in social media platforms. As neither parents nor teenagers believed that corporations would share personal information in such settings, they did not think corporations posed any threat to their physical, mental or reputational wellbeing. Without any immediate or visible harm in the public domain, there was no motivation to protect themselves or their children against corporate data mining practices.


Ultimately, the bigger threat of corporate surveillance and data profiling lies in their ability to usurp individual agency and self-determination through predictive functions that predetermine not only consumer choices but individual access to a range of opportunities, such as finance, insurance, education, work, social connection, knowledge and interests, thereby shaping life outcomes for us. Prior to the digital age we had greater ability to withhold personal details and intimacies which, if released to others, institutions or those in authority, might alter the balance of power to our disadvantage. The excessive data surveillance carried out by corporations poses significant challenges to maintaining what we might call our ‘decisional privacy’.


Although heightening users' awareness of the pitfalls of corporate surveillance is an admirable and useful exercise in creating civil concern and amplifying regulatory pressure, it is unlikely that parents will step up to review the privacy policies and terms of consent of social media platforms and websites on behalf of their children, or that teenagers will be motivated to leave platforms whose mission it is to mine personal information about them. To sum up, this research makes sense of privacy practices through a conceptual lens and offers an explanation for user apathy toward corporate surveillance. It argues that the current neoliberal model of governance, in which global tech data practices flourish in a largely unregulated environment while users, parents and teenagers are made responsible for managing online data privacy risks, will ultimately fail in the absence of a conceptual shift in the way we think about privacy harms.

_________________________________________________________________________________

Author bio: Dr Caroline Keen is a digital sociologist and consultant whose research and expertise navigate social issues arising from new digital media and technologies within neoliberal regulatory environments. She is a qualitative specialist whose research has focused on the social impacts of corporate surveillance, internet regulatory frameworks, pornography regulation, and digital exclusion in relation to children and families.

Thank you to InternetNZ for funding this research, and to Professor Alan France and Associate Professor Janet Sayers for advising on the project. A special thanks to the parents and teenagers who agreed to be interviewed.
