Analysis of 200 education dept-endorsed school apps finds most are selling BS when it comes to the privacy of children’s data

A groundbreaking study by UNSW researchers has revealed alarming vulnerabilities in the digital tools adopted by Australian schools, exposing a widespread and immediate risk to the privacy and security of underage users. The analysis of nearly 200 educational applications, commonly recommended by schools and education departments, found that a vast majority begin collecting sensitive student data almost instantaneously upon launch, often in direct violation of their own stated privacy commitments. This practice leaves children, who are increasingly reliant on digital learning, susceptible to significant privacy breaches and potential security threats.

The comprehensive audit, conducted by UNSW’s cyber security experts, examined a broad spectrum of Android educational apps. These applications were sourced from official school recommendation lists, state Department of Education websites, and the widely used Google Play Store, representing a significant portion of the digital ecosystem used in Australian classrooms. The findings, detailed in the paper "Analysing Privacy Risks in Children’s Educational Apps in Australia," authored by Dr. Rahat Masood and colleagues Sicheng Jin, Jung-Sook Lee, and Hye-Young (Helen) Paik, paint a stark picture of the digital landscape facing young learners.

The Illusion of Safety: Deceptive Branding Masks Data Harvesting

A central and deeply concerning finding of the research is the prevalence of what the authors term "the illusion of safety." Many apps, particularly those specifically marketed towards young children with labels such as "Kids," "Preschool," or "ABC," were found to offer no greater privacy protection than general-audience applications, and in some cases, were demonstrably worse. This deceptive branding cultivates a false sense of security among parents and educators, who may reasonably assume that apps specifically designed for children have undergone rigorous privacy vetting.

The study revealed that a staggering 76% of apps targeted at children exhibited at least one form of policy distortion, a figure that surpasses the 67% observed in general educational titles. This distortion manifests as a significant gap between the app’s stated privacy commitments and its actual data collection and sharing practices. Far from being benign educational aids, these apps often embed the same sophisticated advertising and analytics tools commonly found in commercial entertainment applications. These tools, designed to track adult internet users for targeted advertising and behavioral analysis, are now being deployed on young, impressionable minds without adequate consent or understanding.

Dr. Masood highlighted the insidious nature of this practice: "Child-centric branding cultivates parental trust without providing genuine protection. Parents are led to believe these apps are safe havens for their children, when in reality, they are often conduits for extensive data collection." The research team found that many of these applications not only collected sensitive data but also transmitted it to third parties, often through complex and deliberately opaque privacy policies that render them virtually incomprehensible to the average parent.

Immediate Data Transmission: A Breach from the Moment of Opening

The research team’s dynamic analysis of the apps uncovered an even more immediate and alarming concern: the rapid initiation of data collection. An overwhelming 89.3% of the audited apps began transmitting data to third parties before a user had even interacted with the application. Simply opening an app was sufficient to trigger the transmission of device identifiers, location metadata, and other sensitive information to various analytics platforms and advertising networks.

"Even if you are not interacting with the app – you just open it and that’s it – it is still transferring lots of data," explained Dr. Masood. "Telemetry data mainly refers to tracker-related identifiers used for the automatic collection and transmission of data to remote servers. Despite just opening the app and not using any educational feature, it is still transferring a lot of information that is sensitive and can actually identify your device." This "idle telemetry" means that children are tracked and profiled from the very first moment they engage with these supposedly educational tools, often without any explicit consent.
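As a rough illustration of how an audit might surface this behaviour, a post-processing step over captured network traffic could flag any request whose timestamp precedes the first user interaction. The field names and hosts below are hypothetical, not taken from the study:

```python
def flag_idle_telemetry(requests, first_interaction_at):
    """Return requests sent before the user first touched the app.

    `requests` is a list of dicts with a `sent_at` timestamp (seconds
    since launch) and a `host`; `first_interaction_at` is the time of
    the first tap or keypress. Field names are illustrative.
    """
    return [r for r in requests if r["sent_at"] < first_interaction_at]

# Example: two trackers fire 0.4s and 1.1s after launch, while the
# child first taps the screen at 5.0s.
traffic = [
    {"sent_at": 0.4, "host": "analytics.example.com"},
    {"sent_at": 1.1, "host": "ads.example.net"},
    {"sent_at": 6.2, "host": "content.example.org"},
]
print(flag_idle_telemetry(traffic, 5.0))  # the first two requests
```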

This immediate data transmission is particularly concerning given the governmental stance on protecting young people. Australia has implemented measures, such as the ban on children under 16 using social media, specifically due to concerns about how tech companies target young users. The UNSW findings suggest that the digital tools being actively used within schools may be circumventing these protective measures by operating under a guise of educational necessity.

API Vulnerabilities and Hard-Coded Secrets: A Developer Oversight

Beyond privacy concerns, the study also identified significant security flaws within the educational apps. A substantial 79.4% of the applications contained "hard-coded secrets." This refers to Application Programming Interface (API) keys and credentials that are embedded directly into the app’s code. Such a practice is considered a major security vulnerability, as it allows anyone with the technical capability to decompile the application to access these sensitive keys.


"Hard-coded secrets mean that if you configure an API, you have a password or passphrase, and the API key is hard-coded within the code," Dr. Masood elaborated. "Anyone can access it and do whatever they want with the API. It is not a good practice from a development point of view." This oversight could potentially grant unauthorized access to backend systems, compromise user data, or enable malicious actors to exploit the app’s functionality.
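The anti-pattern Dr. Masood describes, and a common remedy, can be sketched in a few lines. The key value and the environment variable name here are hypothetical, purely for illustration:

```python
import os

# Anti-pattern: a credential embedded in source code ships inside the
# app package, where anyone who decompiles the binary can read it.
HARDCODED_KEY = "sk_live_0123456789"  # hypothetical key, never do this

# Safer: resolve the secret at runtime from the environment or a
# secrets manager, so it never appears in the distributed artifact.
def get_api_key() -> str:
    key = os.environ.get("EDU_APP_API_KEY")  # hypothetical variable name
    if not key:
        raise RuntimeError("EDU_APP_API_KEY is not configured")
    return key
```

On Android specifically, the equivalent remedy is to keep keys out of the APK and fetch them from a backend or the platform keystore, since anything bundled in the package can be recovered by decompilation.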

Opaque Privacy Policies: A Barrier to Parental Understanding

Adding to the complexity of the privacy landscape, the research team found that the vast majority of the apps’ privacy policies were virtually indecipherable to the average user. A mere 3% of the policies were assessed as "fairly easy" to read. The remaining 97% demanded a university-level education or higher to comprehend, and were laden with technical jargon and legalistic phrasing.

"Nobody will understand these terminologies and jargon," Dr. Masood stated. "Comprehension, readability, understandability – all these metrics that we analysed were all very bad." This deliberate obfuscation of information effectively shields developers from accountability, as parents are unlikely to understand what data is being collected, why, or with whom it is being shared.
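Readability labels like "fairly easy" typically come from standard metrics such as the Flesch Reading Ease score. A stdlib-only sketch of that formula, with a crude vowel-group syllable heuristic, shows why dense legal prose scores so poorly (this is a generic implementation of the public formula, not the study's tooling):

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

# Scores above roughly 60 read as "fairly easy"; dense legalese
# typically lands far below 30, i.e. graduate-level reading.
```

Short, plain sentences score above 100 on this scale, while a single sentence of contractual boilerplate can drop well below zero.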

The disconnect between stated privacy and actual behavior was further underscored by the finding that only a quarter of the apps examined demonstrated consistency between their privacy policies and their observed data collection practices during testing. Many apps made claims of "Data Not Collected" or "no ads, no tracking" in their store descriptions, yet were found to be actively initializing analytics platforms and transmitting persistent identifiers from the moment they were launched. This suggests a potential misuse of AI tools in generating privacy policies that do not accurately reflect the app’s functionality, or a deliberate attempt to mislead users.
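A minimal way to express the consistency check described above is a set difference between the data categories a policy declares and those observed leaving the device during testing. The category labels here are invented for illustration:

```python
# Data categories claimed in the policy vs. seen leaving the device
# during dynamic testing (labels are illustrative).
declared = {"crash_reports"}
observed = {"crash_reports", "advertising_id", "coarse_location"}

undisclosed = observed - declared   # collected but never disclosed
consistent = not undisclosed        # True only if policy covers everything
print(sorted(undisclosed))  # ['advertising_id', 'coarse_location']
```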

Broader Context: Government Initiatives and Ignored Warnings

The UNSW findings arrive at a critical juncture, as governments grapple with the implications of digital technology on children. Australia’s own privacy commissioner had previously raised concerns about data protection during the trials leading up to the ban on social media use for those under 16. However, these warnings appear to have been largely disregarded in the final reports and subsequent policy decisions. The Office of the Australian Information Commissioner (OAIC) had noted that the Age Assurance Technology Trial (AATT) reports used "inflated privacy language" not supported by the trial’s methodology, and that a comprehensive privacy assessment against the Privacy Act had not been conducted as proposed.

The UNSW study suggests that this pattern of superficial privacy assurance extends to the vetting of apps endorsed by educational bodies. The reliance on quality assurance frameworks that involve no in-depth technical analysis or dynamic testing leaves schools and parents vulnerable. Teachers, often resource-constrained and unaware of the sophisticated data harvesting techniques employed by these apps, rely on these endorsed lists, assuming them to be safe. Parents, in turn, trust that school-approved applications have been thoroughly vetted.

Recommendations for a Safer Digital Future

In light of these alarming findings, the UNSW researchers are advocating for a significant overhaul of how educational apps are vetted and regulated. They propose the implementation of a "traffic light" system, providing a clear and immediate visual summary of an app’s privacy and security profile, bypassing complex legal jargon.
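One way such a system could work is a simple mapping from a handful of audit signals to a colour. The signals and thresholds below are illustrative only, not the researchers' actual proposal:

```python
def traffic_light(trackers_at_launch: int,
                  policy_matches_behaviour: bool,
                  has_hardcoded_secrets: bool) -> str:
    """Collapse a few audit signals into one colour (illustrative thresholds)."""
    if (trackers_at_launch == 0 and policy_matches_behaviour
            and not has_hardcoded_secrets):
        return "green"
    if trackers_at_launch <= 2 and policy_matches_behaviour:
        return "amber"
    return "red"

print(traffic_light(0, True, False))  # green
print(traffic_light(5, False, True))  # red
```

The appeal of such a scheme is that a parent or teacher sees a single colour derived from actual technical testing, rather than having to parse a policy written at graduate reading level.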

Furthermore, the research calls for stricter oversight of the "child-directed" app category. Labels such as "Kids" or "Educational" should be underpinned by a verified technical baseline, rather than serving merely as content descriptors. The researchers also urge regulators to prohibit "idle telemetry" – the practice of transmitting data before any user interaction has occurred.

The project, funded by the UNSW Australian Human Rights Institute, underscores the urgent need for greater transparency, accountability, and robust technical scrutiny in the development and deployment of educational technologies. Without immediate action, the digital tools intended to enhance learning risk becoming pervasive instruments of data exploitation, jeopardizing the privacy and security of Australia’s youngest generation. The implications are far-reaching, potentially impacting children’s digital footprints, future opportunities, and overall well-being in an increasingly data-driven world. A coordinated effort involving developers, educational institutions, and regulatory bodies is paramount to ensure that technology truly serves the best interests of children.
