"Never let a good crisis go to waste" -- Sir Winston Churchill
Cases where educational institutions in more technologically advanced nations used third-party apps to monitor students are not new. The first one that comes to mind is from 2010, when a Pennsylvania school district was sued for remotely turning on the cameras of the laptops it issued to students. Fast forward to these post-GDPR times, when data privacy is rising in importance among businesses and governments, and we find this habit is still alive and well:
- In 2019 the first ever fine issued by the Swedish data protection authority was against a local school running a trial of a facial recognition system on 22 of its students.
- In 2020 Google was sued for collecting Illinois students' biometric data, allegedly in violation of the state's Biometric Information Privacy Act (BIPA).
Yesterday, Human Rights Watch announced the completion of a global study, run between March and August 2021, on privacy and online learning platforms. The study examined 164 education technology (EdTech) products -- websites, apps installed on laptops and phones, etc -- used in 49 countries. Its conclusion is that most of these products either put children's privacy and other rights at risk or directly violate them. At best, this data is being sent to companies that collect and sell personal data: AdTech companies.
To understand the impact of these findings, we should take a look at the list of 49 countries covered by the study. It contains countries that are supposed to have strong privacy laws:
- Brazil (LGPD)
- France, Germany, Italy, Poland (GDPR)
- Japan (APPI)
- United States (FERPA, CCPA, COPPA, BIPA, etc)
When mandatory COVID lockdowns started, schools switched to online means of presenting their lectures to their students. Some schools were already prepared -- they had an online system in place -- but the majority had to struggle to find a solution. With the help of their governments, they selected the EdTech products they thought were best suited to their requirements. In their hurry, they did not do a proper Data Privacy Impact Analysis on these products, especially as applied to children's personal data. Worse, these programs require children to surrender their personal data; if they refuse, they are reported as absent, possibly leading to them being dropped out of school as a result.
We do not know whether this data collection code was added to the apps accidentally or not, but we think the solution is simple: EdTech companies have to
- Ask the children (and their parents) for consent. And by that we do not mean the implied consent common in badly written software, but proper consent as defined in GDPR Articles 7 and 8.
- Be transparent about how the collected personal data is to be used.
- Ensure their applications' features still work even if consent to use biometrics is not provided. In other words, they should offer an alternative authentication method. From a programming best practices point of view, this means abstracting the authentication instead of having its code intertwined with the rest of the program. That also means it can be updated to the next technology. Good programs do this already.
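The abstraction described in the last point can be sketched roughly as follows. This is a minimal illustration, not any EdTech vendor's actual code: the class names, the password store, and the SHA-256 hashing are all assumptions made for the example. The idea is that application code depends only on a common interface, so a biometric method can be offered when consent is given and a consent-free fallback used otherwise.

```python
import hashlib
from abc import ABC, abstractmethod


class Authenticator(ABC):
    """Common interface: the rest of the app depends only on this,
    never on a specific authentication technology."""

    @abstractmethod
    def authenticate(self, user_id: str, credential: str) -> bool:
        ...


class PasswordAuthenticator(Authenticator):
    """Consent-free fallback. Checks against a hypothetical store of
    SHA-256 password hashes; a real system would use a salted, slow
    hash such as bcrypt or Argon2."""

    def __init__(self, hashes: dict):
        self._hashes = hashes

    def authenticate(self, user_id: str, credential: str) -> bool:
        digest = hashlib.sha256(credential.encode()).hexdigest()
        return self._hashes.get(user_id) == digest


class BiometricAuthenticator(Authenticator):
    """Used only when the student (or their parents) have consented.
    The SDK integration here is a placeholder."""

    def authenticate(self, user_id: str, credential: str) -> bool:
        raise NotImplementedError("biometric SDK call would go here")


def attendance_login(auth: Authenticator, user_id: str, credential: str) -> bool:
    """Application code is written against the interface, so the
    authentication method can be swapped without touching this function."""
    return auth.authenticate(user_id, credential)
```

With this structure, withholding biometric consent simply means constructing the application with a `PasswordAuthenticator` instead of a `BiometricAuthenticator`; no other code changes, and no student is marked absent for refusing to hand over biometric data.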