18 September 2019 — True Publica
This article is part of a series dedicated to a significant and wide-reaching Big Brother Watch publication focusing on state surveillance that has been largely ignored by the mainstream media. Today, there is a very serious worry that the entire mechanism of our democracy is being undermined by excessive and uncontrolled state surveillance. The government's disproportionate obsession inhibits the fundamental ability of campaigners such as human rights activists, civil liberty experts and even local non-violent protestors to exercise their rights, and has the potential to stall social change.
The report covers state surveillance in areas such as policing, blacklisting, investigative journalism, legal (profession) privilege, vulnerable groups, peaceful protestors and campaigners, schools, immigration, health, welfare – in fact, almost every corner of civil society. This is an excerpt dealing with the ‘chilling effect’ of surveillance on our children. We will be publishing parts of this report over the next few weeks.
Jen Persson founded ‘defenddigitalme’ in 2015, as a non-profit, data privacy and digital rights group led by parents and teachers with the aim of making all children’s data safe, fair, and transparent across the education sector in England. This is what Jen Persson wrote in that landmark BBW report entitled – “The state of surveillance.”
Children in schools and young adults at universities are subject to state and commercial surveillance perhaps more than any other community in England. Visible surveillance tools like CCTV in playgrounds, corridors and private spaces such as bathrooms or highly invasive all-seeing classroom cameras are on the rise.
In addition, biometric systems are increasingly found in educational spaces. From handprint entry readers in an Oxford nursery, to fingerprints taken for cashless catering, tracking of library book loans, and access to lockers and printers in the majority of secondary schools, children are expected to hand over their sensitive biometrics from as early as age two. Basic services like free school meals or libraries can thus become inaccessible unless children agree to this intrusion. Biometric systems are mainly installed with the intention of easing administrative burdens and saving back-office costs, when in reality there is qualitative evidence that these savings fail to materialise or simply displace costs elsewhere.
While these surveillance methods are noticeable, it is the collection of children’s data that creates hidden surveillance far beyond the circle of people a child or their parents might expect.
The cost for individuals is not only a threat to their lifetime privacy but also a chilling effect on participation, on others’ perceptions of their potential, on young people’s trust in confiding in an authority figure, and on expectations of a professional duty of confidence.
Unseen surveillance through data
There are several ways in which the current education system gathers sensitive information about young people. For one, state school administrative databases in England focus primarily on delivering a way to benchmark local organisations within the state so that national comparisons can be made over time, and across the sector. In the process, each stage of a child’s education is passed up the chain via a ‘Common Transfer File’ to the next organisation. The Department for Education (DfE) demands a huge volume of data about individual children from state funded schools and nurseries three times every year in the School Census, as well as in other annual surveys.
Schools’ internal information management systems, predominantly managed by Capita SIMS in England, record a child’s name, date of birth, ethnicity, gender, and family address, as well as their behaviour and sensitive reasons for leaving the mainstream system for Alternative Provision, such as pregnancy or mental health. Special educational needs are also included in the national census.
This trove of sensitive data is increasingly used to link school records with other third-party datasets. Local authorities are joining educational data with information about individuals and households bought from data brokers like Mosaic or Acorn, to explore the possibilities of making early interventions based on algorithmically predicted behaviour. Researchers are also using national pupil data for predictive modelling to design classroom interventions based on children with a ‘certain’ profile. Research teams at the Ministry of Justice are now using sensitive school data, such as information about children in care, to fill in gaps on criminal databases such as the Police National Computer and to identify absent fathers in the family justice system.
However, this linking of records creates new risks.
Reasons-for-exclusion labels from mainstream schooling can be interpretative and opinion-based but are widely shared in research, copied, distributed and treated for many years as facts on children’s records. The children themselves and their families have no idea who knows what about them.
This usage could have unintended consequences for communities when models based on predefined comparative and collective characteristics target a group of people, particularly when no data accuracy checks are made. As the example of the Troubled Families Programme shows, where families only have to match two criteria to be considered ‘troubled’, interventions are decided at a national level, even if carried out locally. Thus, surveillance through data can have effects with a permanency and authority that paper records previously did not have.
‘Dataveillance’ also stretches from schooling into higher education. Extensive data from Virtual Learning Environments are combined with student enrolment data to profile and predict behaviours and outcomes. Complaints from staff and students made to ‘defenddigitalme’ have included concerns about the breadth of data available to a wide range of staff, creating the risk of profiling and screening by ethnicity, religion, student and/or parental wealth, which could have adverse effects on the treatment of students by staff, even if the intention was to be beneficial.
Commercial surveillance has become commonplace
The issue of dataveillance and sharing of sensitive data is not limited to public services and governmental institutions. As the volume of data about individuals has grown, companies that facilitate the gathering of data are eager to share it with third parties since data brokerage has become the primary business model for many.
Free-to-school apps are a pathway into pupils’ data and parental purchasing power. Too often schools assume parents will want to use these products and pass on personal data by registering every child and their family without asking. If a parent later objects to the initial sign up, it is often too late to prevent a private company from having rights to access and use the data. Parents and children have no idea how many apps and third parties track and profile their use of software inside or outside the classroom.
Most of the time, parents and primary aged pupils do not even have a choice whether or not to use these systems when the school makes purchasing decisions. Some become central to a teacher’s distribution of classroom materials, homework tracking and day-to-day activity, often without any school-level oversight.
The lack of knowledge and training around rights and responsibilities leaves a large gap for individual commercial exploitation through the introduction of new systems.
Children are profiled, tracked online, targeted by advertising, and their data used to develop products and ultimately increase profit margins, all without any digital understanding or awareness. Some of the information gathered is very sensitive as children can post their photographs or hobbies into profiles which are all available to external viewers.
Surveillance under the guise of safeguarding in schools
Children and young people should not find that software introduced for their safeguarding, causes them lifelong reputational risk and real harm. Yet this is the result for some children wrongly labelled as at risk of suicide or gang membership, and whose details are passed on to third-parties including the police, under the Prevent duty.
Under the guise of safeguarding, surveillance software on children’s school and home computers monitors what they do around the clock, every day of the year. Every keystroke is monitored, and some software checks against libraries of over 20,000 watchwords. Every screen is captured. Some providers even permit the IT administrator to operate a child’s webcam remotely, out of school hours, whenever the device is logged in to a school-administered account, whether for homework or when the child uses the laptop at weekends or during holidays.
Impero’s system even includes the word “biscuit”, which the company says is a slang term for a gun. This potentially affects more than “half a million students and staff in the UK”. Currently there is no understanding or oversight of the accuracy of this kind of software, and instead of accountability being implemented, black-box decision-making is often trusted without being open to human questioning.
Error rates are opaque, and system providers have little incentive to be transparent. Teachers concerned enough to contact us described pupils searching for something uncontroversial and having it flagged by the system, which then only allows staff to attach a ‘note’ that it was an error, not to delete it. Companies have no incentive to lower their “success rate” of events captured.
In our research of over 400 schools in England, we are yet to find one policy that makes any mention of the supplier name, or of what policy there is on profiling, keywords, third-party access, retention, error rates, or routes of redress. 84% of parents in the State of Data survey said they believe they should be informed which keywords get flagged, and 86% want to know what the consequences are, but they do not currently know. The solution to many concerns about child safety, like self-harm or inappropriate content, is human, not one-size-fits-all technology.
Social Media as a Surveillance Tool
Laws and regulations have been left behind as monitoring young people’s social media in schools and universities has become common practice. Protecting a young person’s social media from unwanted monitoring is thus very difficult.
In the summer of 2018, the Student Loans Company was accused of accessing content students post on social media pages that are not restricted to private (such as an open Twitter feed) to identify fraudulent applications for funds available to those without family support.
Concerned parents have also contacted us about their own social media feeds being monitored when they found out that schools added photos of their teen at an anti-fracking demonstration, outside school hours, to the child’s school record.
The chilling impact this surveillance has on students’ and parents’ free speech, willingness to ask questions, and criticism is often reinforced by school-home policy agreements, in which both parents and children are required to sign that they will not cause the school reputational harm.
Where do we go from here?
According to the Department for Education, new technology is set to spearhead a classroom revolution. While the Secretary of State, Damian Hinds, believes only a minority of schools and colleges are currently taking advantage of these opportunities, both schools and the DfE are taking too little note of the risks and harms, and of how to mitigate them.
“Children’s human rights to a full and free development, as established in the UN Convention on the Rights of the Child, are under unprecedented threat in England today.”
We can already start to see the dangers and harms to individuals and groups resulting from errors and bias in the data that feeds into the Prevent programme, and the exposure to exploitation risks. However, the long-term impact of amassing children’s data today, and the chilling effects of classroom surveillance, may not yet have been felt, since the impacts are under-researched.
Political agendas change. Children’s School Census data has already been misused to identify undocumented migrant children and their family members. The UK has effectively registered all Roma families through their children’s school records — what if a future government decided on a Roma policy as discussed in Italy in the summer of 2018?
If a child is an undocumented migrant, an ethnic minority, a non-conformist, or simply not liked by staff, some of the current school software and surveillance systems are more likely to pick them out for intervention than their classmates. Systemic unfairness encoded into data and algorithms is given an authority it does not deserve. Data can be badly understood and result in harmful false predictions and mistaken conclusions. Children are, by default, being disempowered from understanding or correcting decisions and predictions made about them.
Profiling children by their search terms through web monitoring, including remote camera access to children’s devices during and beyond school hours, should be urgently reviewed, as we set out in our State of Data 2018 report. It should be of utmost priority that teachers and all educational staff are trained in digital literacy, data protection and privacy rights. Oversight and accountability, for both human and algorithmic usage of education data, need urgent regulatory attention. Children and young people must be given a better digital understanding of their own data if our future society is to flourish.