Of all the propaganda that swirls around public education, none is more prominent than the claim that technological “personalized learning” can transform education. Ed-tech companies trumpet how “innovative” and “engaging” their products are, perfect for efficiently training the 21st-century drones needed by the global economy. Buy our digital devices for your students, they urge, and we’ll throw in the apps for free! Act now! Don’t be left behind!
Not so fast. University of Colorado researchers recently released a report analyzing the myriad problems with “personalized learning” schemes. Every legislator, education professional, and parent should read this report and slam on the brakes.
Asleep at the Switch: Schoolhouse Commercialism, Student Privacy, and the Failure of Policymaking focuses on “how technological advances, the lure of ‘personalization,’ and lax regulation foster the collection of personal data and overwhelm efforts to protect children’s privacy.” It examines how ed-tech companies are flooding schools with untested and unvalidated products, targeting children with sophisticated marketing and urging or requiring schools to “effectively funnel children into a ‘surveillance economy’ that harvests their data for profit.” In short, students’ privacy is being obliterated and their education warped.
The report explains how education technology intensifies corporate marketing by creating consumer profiles on students. The enormous amounts of data collected as students interact with the software constitute “data gold.” Even the few states that limit the sale of this information allow its transfer during acquisitions and bankruptcies, everyday events in the technology world.
But beyond the marketing possibilities, this data trove threatens student privacy and education in fundamental ways. As we’ve written, by analyzing keystrokes, time spent on various computerized tasks, and even physiological data such as facial expression and heart rate, such software can assemble a picture of how a child’s brain works and how he or she approaches various situations. Tech gurus then feed this data into algorithms, which the Colorado researchers define as “theories that reflect which pieces of information the algorithms’ authors consider valuable and how their authors believe those pieces should be used to draw conclusions.” Those algorithms predict a student’s behavior, such as dropping out of school or succeeding or failing in certain courses. This is what “personalized learning” means.
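To make the mechanics concrete, here is a minimal, purely hypothetical sketch of the kind of predictive pipeline the report describes. Nothing below comes from any actual vendor’s product; the feature names (keystroke rate, time on task, heart rate), the invented risk rule behind the labels, and the numbers themselves are all made up for illustration.

```python
# Hypothetical sketch of a "dropout risk" predictor built on behavioral
# telemetry. All data is synthetic; real vendors do not publish their models.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)
n_students = 500

# Invented features: keystrokes/min, minutes on task, heart rate (bpm).
X = np.column_stack([
    rng.normal(40, 10, n_students),
    rng.normal(25, 8, n_students),
    rng.normal(80, 12, n_students),
])

# Invented labels: in a real product these would come from historical
# records, with whatever biases that history carries.
logits = -0.15 * (X[:, 1] - 25)  # made-up rule: less time on task, more risk
y = (rng.random(n_students) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

# The "personalized" output: a single opaque risk score for a new student.
new_student = np.array([[35.0, 15.0, 95.0]])
print(f"Predicted dropout risk: {model.predict_proba(new_student)[0, 1]:.0%}")
```

The point of the sketch is not the model but the pipeline: behavioral telemetry goes in, one opaque number comes out, and consequential decisions get hung on that number.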
The report identifies several pitfalls of this algorithmic world. One is that algorithms are created by human beings, who inevitably inject their own preferences and biases. Biased assumptions go in, so biased predictions come out: the algorithms may qualify as a classic case of “garbage in, garbage out.”
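A toy illustration of that problem, again with invented data: if the historical labels an algorithm learns from were themselves skewed against one group of students, the model will faithfully reproduce the skew. The groups, labels, and numbers below are all hypothetical.

```python
# Hypothetical sketch of bias inheritance; synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=1)
n = 2000

ability = rng.normal(0, 1, n)    # identical distribution for everyone
group = rng.integers(0, 2, n)    # two demographic groups, 0 and 1

# Biased historical labels: group 1 was marked "failing" more often at the
# same ability level. The model has no way to know those labels were unfair.
logits = -ability + 1.0 * group
fail = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(np.column_stack([ability, group]), fail)

# Same ability, different group: group 1 gets a higher "risk" score.
same_ability = np.array([[0.0, 0.0], [0.0, 1.0]])
print(model.predict_proba(same_ability)[:, 1])
```

The model’s math is flawless; the garbage was in the labels all along.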
A second problem is lack of transparency. The process is “proprietary,” meaning no third party can verify the validity and reliability of the algorithm. Trust us, the creative geeks say, because we’re super smart. But the smart guys don’t tell customers (the schools) that they haven’t yet figured out how to code in the perceptions and reactions that human teachers would account for. Even so, when used by colleges and potential employers, these sketchy algorithms could dictate our children’s futures.
A third problem is that “most companies do not field test products before shipping them to schools, nor do they conduct significant research to validate their claims.” Why should they? Schools fall for the slick marketing, and the students become the (unpaid and unprotected) guinea pigs.
Yet another problem is data security. The report lists sobering statistics about the lax protections offered by tech companies; for example, of the 152 privacy policies studied, only 46 reported using encryption to protect the data. Technology entrepreneur Maciej Ceglowski is quoted as warning: “Even if you trust everyone spying on you right now, the data they’re collecting will eventually be bought or stolen by people who scare you. We have no ability to secure large data collections over time.”
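For contrast, the baseline protection most of those policies failed to mention is not exotic. Here is a minimal sketch of encrypting a student record at rest, using Python’s widely available cryptography library; the record itself is invented.

```python
# Minimal sketch of encrypting student data at rest.
# Requires: pip install cryptography. The record below is invented.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, kept in a key-management system
cipher = Fernet(key)

record = b'{"student_id": "S-0001", "time_on_task_min": 25}'
token = cipher.encrypt(record)  # this ciphertext is what should sit on disk

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(token) == record
```

Even so, Ceglowski’s warning stands: encryption protects data from outsiders, not from the company holding the key, and not forever.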
The report then asks some fundamental questions about the ed-tech culture. Should schools be immersing children in the “surveillance economy,” teaching them to accept constant monitoring and sharing of their most personal data? That this already happens outside school doesn’t excuse schools’ participation. And should education really be reduced simply to job training for students, via “digital badges” and other data collection that will be shared with various “stakeholders” who wonder what the mystery algorithms say about a particular kid?
The report ends with recommendations for regulatory action. These include requiring algorithms powering education software “to be openly available for examination by educators and researchers” and to be analyzed by independent third parties for bias and validity.
But the report doesn’t suggest the most fundamental check on this dangerous software: fully explaining to parents how the algorithms work, what data they collect, and how that data will be used and shared, and then requiring schools to let parents choose whether their children will participate. Parents have a fundamental right to be told about any threat to their children and to protect them from it, even if computer guys and educrats think it’s a terrific innovation.
The Colorado researchers have exposed what qualifies as an education scandal. Their report deserves consideration in every legislature and school in the country.
Source: http://bit.ly/2y7cQ9o