Thursday, May 26, 2022

Children's Right to Privacy, Education, and COVID

"Never let a good crisis go to waste." — Sir Winston Churchill
[Image: road-sign-style "School Children" sign watched by cameras. Copyright 2022 Mauricio Tavares, Privacy Test Driver]

Cases where educational institutions in more technologically advanced nations used third-party apps to monitor students are not new. The first one that comes to mind is from 2010, when a Pennsylvania school district was sued for remotely turning on the cameras in the laptops it issued to its students. Fast forward to these post-GDPR times, when data privacy is rising in importance among businesses and governments, and we find this habit is still alive and well:

Yesterday, Human Rights Watch announced the completion of a global study, run between March and August 2021, on privacy and online learning platforms. The study examined 164 education technology (EdTech) products -- websites, apps installed on laptops and phones, etc. -- used in 49 countries. Its conclusion is that most of these products either put children's privacy and other rights at risk or directly violate them. At best, this data is being sent to companies that collect and sell personal data: AdTech companies.

To understand the impact of these findings, we should take a look at the list of 49 countries that were subject to this work. It contains countries that are supposed to have strong privacy laws.

If we were to pick one, the GDPR, we see it specifically describes what constitutes a legitimate basis for processing data (Article 6), when to obtain consent to collect said data (Article 7 and Recital 32), how to handle children's consent (Article 8), and what must be considered when handling children's personal data (Recital 38). In other words, laws and regulations were in place to protect this data. So why did that not happen?

When mandatory COVID lockdowns started, schools switched to online means of presenting their lectures to their students. Some schools were already prepared -- they already had an online system in place -- but the majority struggled to find a solution. With the help of their governments, they selected the EdTech products they thought were best suited to their requirements. In their hurry, they did not do a proper Data Privacy Impact Analysis on these products, especially as applied to children's personal data. And these programs require children to surrender their personal data or be reported as absent, possibly being dropped from school as a result.

We do not know whether this data collection code was added to the apps accidentally or not, but we think the solution is simple: the EdTech companies have to

  • Ask the children (and their parents) for consent. And by that we do not mean the implied consent common with badly written software, but proper consent as defined in GDPR Articles 7 and 8.
  • Be transparent about how the collected personal data is to be used.
  • Ensure their applications' features still work even if consent to use biometrics is not provided. In other words, the application should have an alternative authentication method. From a programming best-practices point of view, this means abstracting the authentication instead of having its code intertwined with the rest of the program. That also means the authentication can be updated to the next technology without rewriting the application. Good programs do this already.
If students (and/or their parents) choose not to allow their biometric data (facial recognition in this example) to be used, they should not be penalized (reported as absent), and the school must provide another, less intrusive means of identification. The school also needs to explain to vendors that not being able to offer an alternative authentication system will seriously hurt the chances of their online educational product being selected. COVID is not an excuse to suspend children's privacy and security.
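The "abstract the authentication" idea above can be sketched in a few lines of code. This is a minimal illustration, not any real EdTech product's API; every class and function name here is hypothetical. The point is that the rest of the program talks only to a generic interface, so a biometric method and a non-biometric fallback are interchangeable, and a student who withholds biometric consent can still log in:

```python
from abc import ABC, abstractmethod

class Authenticator(ABC):
    """Generic authentication interface. The rest of the application
    depends only on this class, never on a specific method."""

    @abstractmethod
    def authenticate(self, student_id: str, credential: str) -> bool:
        ...

class FacialRecognitionAuth(Authenticator):
    """Biometric method -- only selectable with explicit consent."""

    def authenticate(self, student_id: str, credential: str) -> bool:
        # Stand-in for a real facial-recognition check.
        return credential == f"face-template:{student_id}"

class PasswordAuth(Authenticator):
    """Non-biometric fallback: no biometric data is ever collected."""

    def __init__(self, passwords: dict[str, str]):
        self._passwords = passwords

    def authenticate(self, student_id: str, credential: str) -> bool:
        return self._passwords.get(student_id) == credential

def make_authenticator(biometric_consent: bool,
                       passwords: dict[str, str]) -> Authenticator:
    """Pick a method based on consent. Login works either way, so a
    student is never marked absent just for refusing biometrics."""
    if biometric_consent:
        return FacialRecognitionAuth()
    return PasswordAuth(passwords)

# A student who declined biometric consent still signs in normally:
auth = make_authenticator(biometric_consent=False,
                          passwords={"student42": "s3cret"})
print(auth.authenticate("student42", "s3cret"))  # True
```

Because the swap happens behind one interface, replacing either method with next year's technology means writing one new subclass, not rewriting the program.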