Ben Williamson
Image: Jorge Franganillo
Over the last decade or so, researchers of education policy have focused much attention on the role of numerical data in how schools, teachers, and whole education systems are governed and managed. Far less attention has been paid to the digital technologies that facilitate the flow of such numbers. A significant output of the Code Acts in Education project is a new special issue of the European Educational Research Journal on the topic of ‘Digital Education Governance.’ The eight original research articles in the issue detail some of the ways in which the numerical management of educational institutions, processes and practices increasingly depends on digital technologies and related forms of technical expertise that have so far received little detailed scrutiny.
Governing by numbers
Of course, the use of numbers in major domains of government, such as education, crime, health, trade, taxation, and so on, is nothing new. As Nikolas Rose argued in his influential work on government and numbers in Powers of Freedom:
The organization of political life in the form of the modern ‘governmental’ state has been intrinsically linked to the composition of networks of numbers connecting those exercising political power with the persons, processes and problems that they seek to govern. Numbers are integral to the problematizations that shape what is to be governed, to the programmes that seek to give effect to government and to the unrelenting evaluation of the performance of government that characterizes modern political culture.
In other words, numbers are used to define the problems that governments might seek to solve, to inform the programmes devised to solve them, and to evaluate the interventions carried out by or on behalf of governments. Moreover, Rose understands numbers as an essential feature of the knowledge required to govern, allowing sectors such as education, public health, trade and crime to be known and therefore acted upon, since ‘there can be no well-ordered political machinery’ without a statistical knowledge of the state of the population and the numbering of persons, goods, activities and so on.
Statistical systems such as databases of information about populations are thus a major technique of government. As Geoffrey Bowker has noted, the governmental operations of the state and the functional logics of databases are symmetrical:
A good citizen of the modern state is a citizen who can well be counted—along numerous dimensions, on demand. We live in a regime of countability with a particular spirit of quantification … [and] governmentality: a modern state needs to conjure its citizens into such a form that they can be enumerated.
Database architectures are not only a technical presence but also a key element in the governmental techniques of the contemporary social environment. How individuals are to be captured, counted, and classified in database systems extends a particular style of governing through quantification, calculation and classification that emerged in the statistical and bureaucratic revolution of nineteenth-century record-keeping (‘first wave big data,’ as Meg Leta Ambrose terms it) but is now amplified into the real-time monitoring of populations through digital big data.
Governmentalizing big data
With the recent rise of big data, much has been made of the potential for government departments and agencies to make use of the administrative data they hold about citizens. This is reflected, for example, in the establishment of the Administrative Data Research Network, a data intermediary that provides access to government data, potentially allowing users to track data on individuals across their entire lifetime—from birth records, through educational, health, crime and employment details, to data on deaths—and to draw on data that might be used to inform policymaking.
Big data also appears to make it possible for governments to work with data analytics agencies to monitor their citizens through their digital traces. The Royal Statistical Society has produced a ‘Data Manifesto’ to encourage government to use sources of both big data and open data in new forms of data-informed policymaking, while a ‘Data for Policy’ conference in 2015 explored the opportunities and challenges of ‘Policy-making in the Big Data Era.’ Along these lines, Nesta and the Cabinet Office are working collaboratively on ideas for a ‘new operating model for government,’ exploring in practice how emerging technologies such as ‘data science, predictive analytics, artificial intelligence, sensors, application programming interfaces, autonomous machines, and platforms’ might in the next five years become ‘ingrained into how government thinks of itself,’ ‘redefine the role of government, and even create a different relationship between state and public.’ The first step in this direction has been the establishment of the ‘Government as a Platform’ initiative by the Government Digital Service.
The political scientists Patrick Dunleavy and Helen Margetts have termed such approaches ‘digital governance,’ a way of redesigning ‘the state in the era of social media, cloud computing, robotization and big data,’ and have argued that ‘many or even most government departments and agencies “are” their information systems and digital presence—the only part of them with which many citizens will interact’:
Governments and citizens operate in a digital environment, leaving digital trails whatever they do and wherever they go. These trails generate huge quantities of information about themselves, each other and any interactions they have. In this context … most governments in the 21st century industrialised world and beyond are reliant on a large digital presence and complex network of large-scale information systems for administrative operations and policy-making.
In this reformatory model, governments are increasingly seeking to ‘prioritise the interactions and feedback loops that would capitalise on the potential of the internet and digital technologies for public policy solutions and service delivery.’ Digital governance, they argue, is a response to technological developments such as massive databases and networked communications platforms, and it is characterized by new automated ‘zero-touch’ technologies; behavioural policy and persuasive ‘nudge’ technologies; big data analysis; database-led information processing; digital-by-default public service transactions and interactions; real-time government data-pooling; network-based communications; open data; and online government.
These developments promise to accelerate and transform the existing techniques of governing by numbers, by making new sources of digital data into sources of numerical knowledge, and by giving authority for the collection and analysis of those data to new sources of expertise such as well-resourced data analytics labs in independent agencies and commercial settings. In other words, in digital governance, big data software has been governmentalized, although, as Dunleavy and Margetts acknowledge, the ‘online worlds of governments and citizens are surprisingly separate’ as various barriers remain within government ‘to using social media and embracing the digital timestream, and developing the data science skills necessary to extract public value from big data.’
Digital education governance
Although digital governance remains an emerging form of government practice, within education specifically the current embrace of various forms of digital data and big data is indicative of a contemporary shift towards the digital governance of education. In this model, authorities engage with the ‘digital timestream’ of educational data as it is generated, and then mobilize the data extracted from it to improve decision-making in relation to policy, institutional management, or even pedagogic practice within classrooms. The ‘digital education governance’ special issue of the European Educational Research Journal identifies some of the emerging technologies and practices of digital governance in education.
In many ways, the special issue can be read as a response to Martin Lawn’s claim in The Rise of Data in Education Systems that while the history of educational governance can be traced through the statistical infrastructures of data collection, calculation and communication that originated in nineteenth-century administrative systems, today the ‘hidden managers’ of educational systems are the software products, data servers and analysis packages that make educational data amenable to being collected, visualized, and used in a variety of ways. As I phrase it in my introduction to the issue:
Contemporary education is increasingly organized through a densely networked apparatus of computer code, algorithms, database infrastructures, architectures, servers, platforms and packages; it is managed through new data analytics and other digital platforms that enable the collection, cleaning and connection of data; it is mediated through websites, data visualizations and graphical forms of communication; it is peopled by new kinds of experts in digital data analysis, knowledge production, and presentation; and it is located in particular institutions, organizations and communities with their own technical ways of doing things, scientific styles of thinking, professional subjectivities, and objectives and aspirations. Digital software technologies, data systems and the code and algorithms that enact them have become powerful yet largely hidden influences in the governing of education.
Likewise, as Tom Liam Lynch notes in his book The Hidden Role of Software in Education, a new kind of ‘software space’ made of code, algorithms and data produced by commercial actors, programmers and analysts now works alongside the ‘political space’ of educational governance, exerting its influence on the ‘practice space’ of the classroom.

The aim of the original articles collected in the special issue of the European Educational Research Journal is to bring into the foreground the digital technologies, software packages, database platforms, and related forms of technical expertise involved in the rise of digital education governance. The articles work with conceptual resources and methods that cut across policy studies, digital sociology, software studies, governance and governmentalities, and actor-network theory, to begin considering the digital layer of contemporary techniques and practices of educational governance.
In the first full paper, Manuel Souto-Otero and Roser Beneito-Montagut provide a wide-ranging synthetic overview of the technical artefacts that constitute the ‘digital turn’ in education governance. Such artefacts are the product of governments, private companies, and educational establishments that are, in different ways, shaping data consumption and production in education. They argue that social actors’ capacities to participate in and deal with the automated interrogation of educational data are leading to diverse strategies of ‘alignment’ as well as ‘resistance,’ ‘gaming’ and ‘rebellion.’
The role of commercial companies with vast R&D resources to develop data analysis technologies and infrastructures is a major factor in techniques of digital education governance. Ben Williamson examines the ‘digital methods’ of Pearson plc: the digitized techniques of data collection, calculation and communication it is now mobilizing to make educational institutions, practices and people amenable to measurement and evaluation. He argues that for Pearson it is the computational affordances of ‘data science’—including data analytics, pattern recognition, data visualization, human-computer interaction, and predictive machine learning techniques—that promise to produce the knowledge through which education and learning are to be understood and acted upon in the data-scientific twenty-first century. Its investment in such techniques, technologies, and related forms of technical expertise appears to make Pearson a dominant source of research and knowledge in contemporary education, potentially displacing the legitimate authority of the social sciences in the production of knowledge about education.
In contrast, Neil Selwyn then analyses many of the more mundane and everyday ways in which digital data work is being conducted within schools and enacted by school leaders, managers, administrators and teachers. Drawing on extensive ethnography of data work in Australian schools, he argues that alongside the digitized data regimes associated with state and federal governments sit smaller-scale accountability procedures and practices initiated ‘in-house’ by school managers and/or teaching staff. While digital technologies are clearly reinforcing wider trends in educational managerialism, Selwyn also considers the subtle ways that local enactments of such governance are shaped by schools’ relatively unsophisticated data processing technologies and techniques.
While in-house data analysis is expanding in schools, external agencies continue to generate and circulate much of the data used to govern them. Jenny Ozga demonstrates how the embodied work of school inspection in the UK—as well as elsewhere in Europe—is now increasingly being displaced to digitized data technologies. School inspection technologies and websites such as school data dashboards, Ozga argues, produce new kinds of governing knowledge that are rendered in calculable numbers, graphs and interactive visualizations. As such, the frameworks that govern how schools are inspected are now coded into software devices produced by commercial contractors and competitive agencies, which then enact algorithmic processes of measuring, categorizing, classifying, sorting, ordering and ranking in ways that displace the embodied professional judgment of school inspectors onto coded surveillance instruments.
Data-based techniques of surveillance and cybersecurity are the focus of the article by Nelli Piattoeva, who discusses the effects of the digitization, scientization, and datafication of education policy in the specific context of the Russian Federation. In particular, she analyses the recent introduction of obligatory video surveillance equipment during public examinations in Russia, and argues that these instruments turn sites of public examination into sites of numerical data production, coercing schools and individual test-takers into becoming docile data producers.
Cormac O’Keeffe provides a forensic examination of the entanglements of human and nonhuman actors that enact the OECD’s PIAAC survey of adult literacy. His investigation employs a novel digital methodology known as ‘trace ethnography’ to examine the detail of e-assessment events, especially focusing on the interactions between coded technologies, algorithms and people and how these are translated into statements about what it means to be a literate adult learner. This in turn highlights the role of nongovernmental organizations in influencing educational and economic policy-making through the intensification of data production.
In the following paper, Richard Edwards and Tara Fenwick focus on the ‘smart’ technologies that are increasingly intervening in professional learning. They argue that the interplay of code, algorithms and big data is increasingly pervasive in the governing, leadership and practices of different professional groups, reshaping the relationships between professional groupings and between professionals and their clients/users/students.
Finally, Matthias Decuypere and Maarten Simons demonstrate how digitally coded technologies are shaping the role of academic practice itself within Higher Education institutions. Through painstaking ethnography, they specifically examine the role of the computer screen in the daily composition of academic practice, drawing attention to how screens perform different functions in academic work, and especially to the capacity of screens to ‘script’ academic settings and make actors in university settings ‘do particular things’ in lecture halls, offices and seminar rooms, in ways that (sometimes) materialize processes of increased bureaucratization, accountability and marketization in Higher Education.
Together, the articles speak to the embeddedness of educational practices of many kinds in highly coded settings, where the values, worldviews and aspirations of programmers, algorithm designers, human-computer interaction designers and data scientists work from a distance to govern the ways in which educational institutions, matters and persons are known and acted upon. The term digital education governance, then, registers the displacement of educational governance to new digitized sites of expertise in data collection and analysis, and also acknowledges the role of digital software, code and algorithms in governing and guiding the conduct of diverse educational actors and institutions. Increasingly, education is being governed through the knowledge gained from its digital trails and timestreams, and through new forms of technical expertise, such as data science and data analytics, that make possible the production of knowledge about educational problems and the specification of their solutions.