Reading Audrey Watters: A reflection on personalised learning via education technology through a decolonial lens


The blogpost was written by Moizza Binat Sarwar for the EdTech Hub blog. The blogpost is available at the EdTech Hub under Creative Commons Attribution 4.0 International. It is reposted here without any modifications.

At EdTech Hub, we’ve been reflecting on how coloniality is embedded in the work we do: from the colonial roots of the international development sector, to colonial practices embedded in research methods, to “core-to-periphery” design and deployment of EdTech interventions. We’ve just begun this journey, but in trying to embrace one of our EdTech Hub values of ‘fearless, humble learning’, we wanted to think out loud with you. This is the third in a long-form series exploring what it means to strive toward ‘Decolonising EdTech’. Thanks to Taskeen Adam for the conversation and comments. [You can find blogs one and two here.]

The main objective of Watters’ book Teaching Machines: The history of personalised learning (2021) is to correct the misconception that EdTech today is ‘new’ and ‘shiny’; in fact, technology has been used to impart education since as early as the 1890s. Drawing on Watters’ book and her ‘Hack Education’ blog, we reflect on elements of Watters’ historical take on personalised learning, one specific aspect of EdTech, and share five decolonial reflections on the current form and landscape of EdTech. We’re thinking out loud about how EdTech designs, products, and implementations can, if not consciously addressed, replicate features of colonial power and extraction.

1. Technology as the saviour of a broken education system

Watters illustrates that developers, sellers, and proponents within EdTech have repeatedly highlighted the sector’s ability to rescue education from crisis (at different points in time, e.g., the 1960s, the 1970s, and the current period) by investing in hardware and software. This is notwithstanding the cost of hardware to schools and individual learners, which Watters identifies as one significant reason why Sidney Pressey’s and B.F. Skinner’s teaching machines did not take off in the 1950s and 1960s, and which remains a persistent barrier to equitable access and use. Today, one core ‘sell’ for designers of education technologies is their ability to provide different forms of digital personalised learning (DPL) to solve the learning crisis for students in low-income countries and for low-performing learners. DPL is understood to provide adaptive learning (where the technological product adapts to the learner’s ability) and incremental instruction, allowing learners to move at their own pace through their own learning pathways. While DPL focuses on individual learning paths, it overlooks the systemic issues that, more often than not, play a bigger role in limiting effective learning. Thus, while some evidence on the effectiveness of DPL in improving learning outcomes in low- and middle-income countries has shown marginal gains, the main concern is the opportunity cost of investing in DPL. By focusing on individual pathways without addressing the historical and present-day systemic injustices that prevent education systems from functioning better, DPL may be no more than a stop-gap solution.

2. Behaviourism as the underpinning learning theory of personalised learning

Watters’ book and its discussion of Skinner are a helpful reminder of the learning theory, championed by Skinner, that underpins personalised learning: behaviourism. As a learning theory, behaviourism has been critiqued for being antithetical to the goals of education, i.e., the ability to engage critically with the world, analyse and comprehend complex material, and build the capacity for self-directed learning. Behaviourism casts students as passive recipients and teachers as information givers, relying on repetition and positive reinforcement of correct responses. The term ‘personalised’ in DPL therefore refers to how the process of learning takes place: learners experience the results of the collection and analysis of data about their responses to a learning programme. The absence of critical thinking, analytical, and comprehension skills in students’ pathways to learning was embedded in the role of teaching machines, in both curriculum and method, even before learning was taken online.

Watters describes the efforts of the Student Nonviolent Coordinating Committee (SNCC) in the US to apply the teaching machines of the 1960s, built on personalised learning, to adult literacy and to improving education access and outcomes for the historically marginalised Black population of Mississippi. SNCC activists were interested in exploring the role of personalised learning in the Freedom Schools, “a network of alternative education centres that offered the kind of teaching and learning that the public school system of Mississippi had refused to provide to its Black population — an education that combined both intellectual and political development and one that expressly linked knowledge with power” (p. 226).
Unsurprisingly, personalised learning delivered by Skinner’s teaching machines was assessed by the SNCC to be antithetical to the aims and objectives of the Freedom Schools, in which teachers and students co-create knowledge and imagine a world in which they are free citizens. The learning method of the machines, repetition of a pre-established correct answer, was seen to promote conformity to a status quo that was deeply violent to the Black population.

3. Lack of user-led development of digital personalised learning

In one of her blogs, Watters highlights a concern, shared by other commentators, about the monochromatic nature of education science: ‘Too often, technologies match the worldview of the designer. Clearly, there’s a large and insatiable market of middle class, well-educated white men who are disgusted by their legacy LMS [learning management systems], but it doesn’t mean they’ve all redesigned it to provide the teaching or the learning that matters, or have used universal design principles’. The development of DPL products by groups that live in a different reality from the learners creates a dissonance that curtails efforts to interrogate and move beyond narrow conceptions of what quality education looks like. The personnel who create the algorithms in DPL, and those who create and/or curate the content of learning materials, are often a homogenous group of experts trained in the same worldviews (often regardless of their country of origin), belonging to, and working out of, institutions wedded to niche bodies of knowledge that are Western-centric. While such groups ostensibly create content for a globally diverse student body, the purported user group is often underrepresented in the creator group. Consequently, EdTech is not ‘borderless, gender-blind, race-blind or class-blind’. The framing of the curriculum, as well as automated delivery in a mode that brooks no discussion, marginalises the knowledge of students, both indigenous and experiential, and reinforces existing epistemic injustices. A mainstream example of this marginalisation is the dominance of English (specifically Anglophone English) as a language required to navigate the majority of digital spaces and as a core component of the digital literacy that is a precursor for effective use of EdTech.

4. Proprietary algorithms inaccessible to users

The rights and restrictions with which these tools are made available to students, teachers, and schools also illustrate unequal power relations. Watters has often written about the pedagogy built into EdTech products, which outsources a key component of the learning pathway to a company. A mild counterweight to this embedded removal of agency would be engaging indigenous educators (teachers and academics), parents, community members, and school administrators in the choices around how an algorithm makes decisions and in the framing and substance of the curriculum. However, tech companies often make their software and/or algorithms proprietary, so the key stakeholders who deliver and experience these products do not get to see how the software ‘learns’, how it decides which pathway a student will learn best through, or how feedback is given to students. Further, given negligible options for the co-creation of content, open-source material is neither universally accessible nor adaptable. The one-way relationship EdTech companies often create with implementers and actual users of EdTech (students, teachers, parents) replicates colonial relationships of subservience and is part of neo-colonial encounters in which large companies hold the same standing and power as states in the lives of citizens.

5. Conditions and inherent coercion within EdTech products

Companies argue that teaching machines/software/hardware tend to get better the more data they access and collect. Their defence is that the data enables them to meet the needs of students, and hence that the data-extraction relationship benefits both parties. The reliance and ideological emphasis on data alone as a source of good pedagogy is rooted in behaviourism and operant conditioning (which Watters identifies as the ‘educational theory’ behind teaching machines). It is distanced from the recognition, embedded in theories of education with roots in constructivism and social learning, that human beings develop knowledge through interaction with and observation of other human beings. Indeed, as Watters notes in her popular blog on education technology, ‘Hack Education’, personalised and individualised learning has always meant data collection and analysis, and it may be more accurate to speak of ‘predictive’ learning rather than personalisation. To access EdTech products, students may have to allow their personal data (name, age, gender, location, facial identifiers, phone numbers, parents’ background, and health indicators, to name a few) to be used by the software in line with the creator’s own policies, which are often not up for negotiation by the student. Mainstream critiques of such data mining aside (e.g., its limited impact on improving pedagogy, the inability to protect data from attacks or from use outside the education system), a key concern is service providers holding users hostage in deeply unjust circumstances. If EdTech is indeed to provide education to marginalised groups such as refugees, to be a short-term replacement for in-school education in the aftermath of a disaster, or to provide learning in remote areas where school infrastructure is weak, then the choice to opt out of data collection at the cost of losing access to the product is a false choice. The ability to opt out is a choice only if alternative, high-quality systems of education exist.
In these circumstances, EdTech may become a tool of exploitation of a valuable resource — student data — given up under duress.

Education technology is a wide and diverse field with a variety of products. While at surface level EdTech and personalised learning products can appear harmless and beneficial, viewed through a decolonial lens their underpinning philosophies, design processes, and uses of data can reinforce inequitable education systems. By explicitly seeking to overcome these challenges, EdTech and DPL can instead provide opportunities for just and liberating education systems.


Image by Alexander Lesnitsky (2016), available on Pixabay.

Moizza Binat Sarwar