Does EdTech Disregard Students’ Digital Rights? — Observatory
Technology in education has opened new horizons and streamlined classroom processes. Learning is now supported by applications that facilitate teaching across academic environments and social contexts. But what happens when these tools improve learning while compromising users’ personal information?
In general, educational institutions determine which educational technology (EdTech) is used in their study programs. However, students at every level must hand over personal data to use these tools.
A report issued by Human Rights Watch (HRW) revealed that EdTech apps and websites used by students collected, monitored, and tracked the personal information of children in many countries, violating their digital rights. The study analyzed 163 educational technology tools used in 49 countries from March to August 2021. From Argentina to Russia, these nations adopted online learning as a component in their national plans during the pandemic’s school closures.
According to the report, 89% of EdTech products compromised children’s rights: they collected data without the consent of children or their parents and monitored what children did in the classroom, who their family and friends were, and what devices their families could afford. The report showed that some companies built profiles of children and adolescents to sell to advertising technology (AdTech) corporations.
The report also revealed that 145 educational technology resources provided access to student data to 199 third-party companies, mostly AdTechs. Many online learning platforms allowed these companies’ algorithms to analyze information to determine the children’s personal characteristics and interests. This way, they could discover what influences them and predict their next steps.
Among the findings, the report detected that some tools targeted students with behavioral advertising, using data extracted from their educational environments to personalize the content and ads that followed them across the internet. The companies extrapolated from children’s online experiences to influence their opinions and beliefs at an age when they are at high risk of manipulative interference. Because of the school closures, students were required to use these products; even children whose families could afford internet access and electronic devices were thus exposed to practices that violated their privacy.
Most education technology companies did not disclose their data surveillance practices: only 35 of the products stated in their privacy policies that users’ information would be used for behavioral advertising. Twenty-three of the products were designed and developed with minors as the primary users.
HRW had previously established that, under the principles of child data protection and the corporate human rights responsibilities described in the United Nations Guiding Principles on Business and Human Rights, EdTech and AdTech companies must not collect or process children’s data for advertising purposes. Companies must inventory the data collected during the pandemic to ensure they do not process, share, or use it, and they must work with governments to remove the information immediately.
Who bears responsibility?
Although everyone involved bears some responsibility, those who make the technological decisions must make the commitment. An article in The Conversation explains that schools determine the digital technologies their students will use. Students have no option to accept or refuse the apps or websites selected for them by educational institutions or departments, and children are not trained to make informed decisions about their online learning.
The study conducted by Human Rights Watch noted that seven countries (Australia, Brazil, Canada, Germany, India, Spain, and the United States) delegate decision-making to education authorities at the state or regional level. Throughout the pandemic, these resolutions included defining which EdTech to endorse or acquire for school use.
Jonathan McCully, writing for the Digital Freedom Fund, noted that large corporations dominate the education technology market, with products deployed in environments where children’s digital rights are poorly enforced, leaving children without oversight, autonomy, or meaningful control over how their information is stored.
Educational contexts have been transformed by reliance on systems that process sensitive biometric data, such as facial mapping or fingerprints, to measure attendance, process lunch payments, or reinforce security. Some systems, such as those based on statistical models, algorithmic profiles, and automated decision-making, can perpetuate discriminatory and exclusionary measures. How? By developing curricula, predicting academic performance, and “detecting” cheating in exams, large corporations and educational institutions can influence a person’s formative years and deny students access to educational opportunities.
Costa Rica offers one example of this landscape with the Strengthening of Learning for the Renewal of Opportunities (FARO) tests. Through questions in the test applied to fifth-grade students in November 2021, the Ministry of Public Education (MPE) violated the rights of thousands of underage students and their families. After parents filed 15 complaints, the court concluded that the students’ right to privacy had been violated: the mandatory test obtained personal data and provided access to information under special protection from the State. Cases like this show that governments, schools, and parents all play a dominant role in protecting privacy.
First, it must be understood that in Mexico, Article 5 of the General Law of the Rights of Children and Adolescents defines children as those under twelve and adolescents as those between twelve and seventeen years of age; under international treaties, children are all persons under eighteen. In addition, Article 76, in the chapter on the Right to Privacy, specifies that children and adolescents have the right to personal and family privacy and to the protection of their data.
For persons of any age, Article 7 of the General Law on the Protection of Personal Data Held by Obliged Subjects decrees that, as a general rule, sensitive personal data may not be processed without the express consent of its owner. The law adds that, in processing the personal data of minors, the best interests of the child and adolescent must take precedence under the applicable legal provisions. Consent to access a person’s data must therefore be obtained under all circumstances; this is vital to ensuring the safety and well-being of minors.
However, this authorization must be revocable. The AyudaLey (HelpLaw) blog on data protection explains that even where consent exists, it does not legitimize excessive use and processing of the data. Users must be allowed to withdraw their consent and have the collected data deleted. The blog warns that, to comply with personal data protection regulations, applications must disclose the purpose of data collection before installation. Regarding minors, the blog advises choosing a restrictive method for processing information, not using it for commercial purposes, and avoiding the collection of details about family and friends.
Identifying the key concepts that differentiate types of personal data helps raise awareness about the information provided to applications, platforms, and third-party companies. According to the Institute of Transparency, Access to Public Information, Protection of Personal Data, and Accountability of Mexico City, personal data is any information relating to a person that makes them identifiable. It includes age, address, telephone number, personal email, social security number, CURP, and academic, work, and professional histories. This data must not be transferred.
Some personal data is sensitive. According to the National Institute of Transparency, Access to Information, and Protection of Personal Data (INAI), this data “reports the most intimate aspects of people. Their misuse may lead to discrimination or put them at serious risk by including racial or ethnic origin; states of health (past, present, and future); genetic information; religious, philosophical, and moral beliefs; trade union membership; political opinions, and sexual orientation.” Such data demands special protection and care. Patrimonial or financial data, in turn, describes people’s economic capacity: their resources and their ability to meet debts.
All of this information needs protection and safeguarding. Educational actors and parents not only manage their own personal data; they are also in charge of guarding children’s data and teaching them why responsibility for personal information matters.
McCully offers examples of binding cases that serve as legal references for action against violations of children’s digital rights. Anne Longfield, England’s former Children’s Commissioner, initiated legal action on behalf of millions of young people against TikTok. In this “historic case,” the platform processed the data of more than 3.5 million children in the UK (including their phone numbers, videos, location, and biometric data) without sufficient warning, transparency, or legal consent. The claim sought deletion of the existing data and compensation that could amount to billions of pounds.
In the Netherlands, a group of parents filed a court case against TikTok, alleging that the platform collected more data than necessary without the corresponding permission. The filing also stated that the company did not specify how it used the information.
McCully even mentions that cases have been filed in several countries against YouTube, Google, Facebook/Meta, and creators of gaming applications for not respecting or keeping children’s privacy.
In the United States, the Department of Education is committed to protecting student privacy by enforcing the Family Educational Rights and Privacy Act (FERPA), which gives students the right to inspect their education records and requires school staff to safeguard them.
It also affects university students
The Chronicle of Higher Education analyzed contracts between five universities and the vendors with which they plan to test “metaversities”: digital, immersive replicas of their campuses that students will attend using virtual reality headsets. The Chronicle found inconsistencies in the privacy and data security provisions governing digital classroom attendance. The contracts did not mention the third-party companies, including Meta, that will collect information from students during the two-year pilot.
In the study, privacy experts found it troubling “that universities embark on educational tech startups with an incomplete understanding about what technology can ultimately extract from their students, especially when numerous private entities are involved.” They advise that awareness is essential to protect students and institutions’ reputations before adopting these new educational technologies, and they point out that universities have a moral obligation not to let responsibility for data protection fall on the students alone.
One guideline comes from the sociologist Miguel Ángel Casillas Alvarado, who suggests primary measures to address the digital rights of university students in Mexico. Drawing on the Charter of Digital Rights of the Government of Spain, he argues that students should demand the right to liberty, under which universities guarantee their students’ data protection. In its third section, the charter establishes the right to data protection, stating, “Everyone has the right to be informed at the time of the data collection about its destination and purposes, to access the data collected that concern them, and to exercise their rights of rectification, opposition, cancellation, data portability, and right to erasure (right to be forgotten) in the terms provided for in national and European data protection regulations.”
What can be done?
Professor Sandy Keeter offers tips for the various stakeholders in education, proposing specific practices through which teachers can ensure their students’ data safety and the institution’s integrity. For example:
Review the data privacy policies of the tools or applications used in the academic program, ensuring that these are approved and endorsed by the university.
Encrypt sensitive information in emails and store it in designated folders, deleting them regularly so as not to leave student data vulnerable.
Do not provide access to academic records or speak publicly about them.
When training, use sample or false data to exemplify the content.
Password-protect computers and learning platforms, and require a log-out when they are not in use.
Educate students about safe online practices and the proper use of technology.
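The tip about training with sample or false data can be sketched in code. The following is a minimal illustration, not part of Keeter’s recommendations: the field names and name lists are invented, and the records are deliberately marked as fake so they can never be confused with real student data.

```python
import random

def make_sample_students(n: int, seed: int = 0) -> list[dict]:
    """Generate clearly fictitious student records for demos and training.

    Every record is synthetic and tagged with a FAKE- prefix, so no real
    student data is ever exposed during a training session.
    """
    rng = random.Random(seed)  # seeded, so demos are reproducible
    first_names = ["Alex", "Sam", "Riley", "Jordan", "Casey"]
    last_names = ["Demo", "Sample", "Test", "Example"]
    return [
        {
            "student_id": f"FAKE-{rng.randint(10000, 99999)}",
            "name": f"{rng.choice(first_names)} {rng.choice(last_names)}",
            "grade": rng.choice(["A", "B", "C"]),
        }
        for _ in range(n)
    ]
```

A teacher preparing a gradebook demo could call `make_sample_students(25)` and work entirely with invented records instead of exporting a real class roster.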
For universities to guarantee confidentiality, they must protect student data by:
Supervising the activity in the university networks.
Training employees and providing support.
Reducing the amount of information collected and purging unnecessary information.
Providing the minimum level of access necessary.
Publishing policies, procedures, and protocols for data protection to make them known.
In addition, the Human Rights Watch report describes in detail a series of recommendations globally for governments, ministries and departments of education, education technology companies, and advertising technology companies, which include:
For governments: urgently facilitate redress for children whose data was collected during the pandemic and who remain at risk due to misuse and exploitation. Adopt specific data protection laws. Ensure that companies respect the rights of minors and are held accountable if they fail to do so. Request evaluations on child rights management in any public procurement process that provides essential services to children through technology. Prohibit behavioral advertising aimed at children. Prohibit profiling of children.
For ministries and education departments: Allocate funds to pay for services that enable safe online education; do not permit the sale and sharing of children’s data to fund services. Provide confidential, age-appropriate, and child-friendly complaint mechanisms, access to expert help, and provisions for collective action in local languages for children seeking justice and redress. Develop and promote digital literacy and children’s data privacy in the curricula. Consult children’s views in developing policies that better protect their interests in online educational settings.
For education technology companies: Provide redress when children’s rights have been put at risk or infringed through their data practices. Provide privacy policies written in clear, child-friendly, and age-appropriate language. Respect and promote children’s rights in the development, operation, distribution, and marketing of EdTech products and services. Provide children and their caregivers with child-friendly mechanisms to report and seek redress for rights abuses when they occur.
For AdTech companies: Identify all child data received through tracking technologies and take steps to delete it quickly, ensuring it is not processed, shared, or used. Stop deploying tracking technologies to monitor children or any users of services aimed at children. Develop and implement effective processes to detect and prevent the commercial use of children’s data collected by these tracking technologies.
The efforts of Common Sense serve as a guide on the subject. This organization rates movies, TV shows, podcasts, books, and other content so that kids, families, and communities can choose entertainment and technology options that have been assessed. Its work highlights technology-related legislation, recognizes solutions that protect consumer privacy, seeks better connectivity for students and families, and holds technology companies accountable for healthy internet use. It also runs various programs that help educators empower their students.
Although there are many recommendations for users and decision-makers, fundamental knowledge about educational technologies is vital for developing the guidelines that should govern these tools. Data privacy policies must be emphasized as an essential part of user agreements, describing what personal data is collected, how it is stored, and how students’ privacy is defended. What measures do you consider necessary to protect digital rights?
Translation by Daniel Wetta