The production and use of data in educational policy is a contentious issue. In recent years, massive international organizations such as the Organisation for Economic Co-operation and Development (OECD) and national governments alike have become key producers and users of data. Yet, to date, relatively little has been written about the digital technologies that make it possible to conduct educational data analyses, or about the growth and technical expertise of ‘independent’ research centres and ‘policy labs’ where educational data analysis is increasingly taking place. These organizations are prototyping a ‘new operating system for government’ within education, promising to install increasingly advanced forms of data science methods, predictive analytics, and even machine intelligence in the policy process.
The collection, calculation and visualization of educational data has a long history, going back at least as far as the spectacular exhibitionary practices of the nineteenth-century international expositions, world’s fairs and scientific congresses. More recently, with the global growth of data systems, educational researchers have begun to query the ways in which data is used in the policy-making process. Researchers of ‘policy by numbers’ have sought to illuminate how data are increasingly being used as the knowledge required for governing education, both by national education departments and by international organizations such as the OECD.
The key issue for such researchers is that of comparison. Through data, schools and national education systems are made comparable in order to allow evaluations to be attached to their different performances. Standardized testing has been a key technique for generating comparable data. So too has the strategy of ‘like-school’ comparison, where schools with similar demographic profiles, similar cultural and financial resources, or statistically similar populations are compared. This strategy requires new techniques for measuring external factors such as social and cultural advantage, so that the specific internal school factors that affect performance can be isolated and compared.
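The logic of a ‘like-school’ comparison can be made concrete with a minimal sketch. The code below is purely illustrative: the schools, the two ‘external’ factors (proportion of disadvantaged pupils and mean prior attainment) and the matching method are all invented here, not drawn from any actual comparison scheme. The idea is simply that schools are matched on external factors first, so that any remaining performance gap between matched peers can be read as reflecting internal school factors.

```python
# Illustrative sketch of a 'like-school' comparison. All data and variable
# choices are invented for the example: schools are matched on external
# (demographic) factors, and the performance gap between matched peers is
# then attributed to internal school factors.
import math

# Each school: (name, share of disadvantaged pupils, mean prior attainment, exam score)
schools = [
    ("A", 0.40, 98.0, 52.0),
    ("B", 0.42, 97.5, 58.0),
    ("C", 0.10, 104.0, 61.0),
    ("D", 0.12, 103.5, 60.0),
]

def demographic_distance(s1, s2):
    """Euclidean distance on the external factors only, each scaled by its range."""
    disadv = [s[1] for s in schools]
    prior = [s[2] for s in schools]
    d1 = (s1[1] - s2[1]) / (max(disadv) - min(disadv))
    d2 = (s1[2] - s2[2]) / (max(prior) - min(prior))
    return math.hypot(d1, d2)

def most_similar(school):
    """Find the demographically closest 'like school'."""
    others = [s for s in schools if s[0] != school[0]]
    return min(others, key=lambda s: demographic_distance(school, s))

for s in schools:
    peer = most_similar(s)
    # The score gap between like schools is read as an 'internal' school effect.
    gap = s[3] - peer[3]
    print(f"School {s[0]} vs like-school {peer[0]}: score gap {gap:+.1f}")
```

Real like-school schemes use far richer statistical matching, but the sketch shows why the measurement of external factors matters: the comparison is only as fair as the matching variables chosen.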
Digital policy instruments
Today, though, educational data is increasingly produced and presented through forms of technical expertise and digital devices that make the power of the numbers even more visible, reproducible and persuasive. For example, The Learning Curve, a vast databank of over 60 global datasets assembled by the commercial education company Pearson, presents its data through heatmaps, time-series tools and country-comparison graphics, which combine to produce a ‘Global Index’ ranking nations in terms of ‘educational attainment’ and ‘cognitive skills’. The Learning Curve demonstrates how educational systems are being mirrored by a digitally-rendered, graphical landscape in which the data has been mediated into a variety of diagrams, charts, tables, infographics and other forms of representation that make education intelligible to a wide variety of audiences. It is a powerful technique of political visualization for envisioning the educational landscape, and for operationalizing the presentation and re-presentation of numbers for a variety of purposes, users and audiences.
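A composite ranking of this kind is, at root, a short calculation. The sketch below shows one common way such an index might be assembled: each indicator is standardized as a z-score and the scores are combined in a weighted average, which is then sorted into a ranking. The country values, indicator names and equal weighting here are invented assumptions for illustration, not the Learning Curve’s actual figures or methodology.

```python
# A minimal sketch of how a composite ranking like a 'Global Index' might be
# assembled: convert each indicator to a z-score, take a weighted average,
# then rank. Country values and weights are invented for illustration.
from statistics import mean, stdev

countries = {
    "Country A": {"attainment": 88.0, "cognitive_skills": 540.0},
    "Country B": {"attainment": 92.0, "cognitive_skills": 510.0},
    "Country C": {"attainment": 75.0, "cognitive_skills": 495.0},
}
weights = {"attainment": 0.5, "cognitive_skills": 0.5}  # assumed equal weighting

def z_scores(indicator):
    """Standardize one indicator across all countries as z-scores."""
    values = [c[indicator] for c in countries.values()]
    mu, sigma = mean(values), stdev(values)
    return {name: (c[indicator] - mu) / sigma for name, c in countries.items()}

# Weighted sum of z-scores per country, sorted into a ranking.
z = {ind: z_scores(ind) for ind in weights}
index = {
    name: sum(weights[ind] * z[ind][name] for ind in weights)
    for name in countries
}
ranking = sorted(index, key=index.get, reverse=True)
print(ranking)
```

Even this toy version makes the political point visible in code: the choice of indicators and weights is made before any ‘fact’ is displayed, and a different weighting would produce a different league table.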
There are plenty of other examples of how educational data is being digitized in order to be packaged up for different purposes. The OECD’s Education GPS website allows users to compile the data held by the OECD from its extensive PISA datasets in order to ‘create your own, customized country reports, highlighting the facts, developments and outcomes of your choice,’ and to compare and review different countries’ educational policies. The commercial educational supplier Research Machines has produced School Finder, a simplified website that allows parents to search for schools in specific geographical areas, to compare those schools according to various data, and then to shortlist their preferred school choices. School Finder is the MoneySupermarket of school choice. And government agencies such as the Department for Education and Ofsted also increasingly emphasize the digital presentation of school data, particularly through ‘School Data Dashboards’ that enable school leaders and data managers to stage their data publicly. These sites subtly encourage the user—whether policy maker, parent or school leader—to participate in the presentation and comparison of schools and national education systems through numerical and graphical displays.
A useful way to conceptualize all of these technologies of data collection, calculation and visualization is that they are ‘digital policy instruments.’ The political sociologists Pierre Lascoumes and Patrick le Gales have argued that ‘policy instruments’ are those techniques and devices that make policies operational and material. But instruments are not politically neutral, innocent devices: they reflect the values and assumptions built into them in the process of their production. Understood as a policy instrument, standardized testing, for example, is clearly not an apolitical device. It is the result of political debates, methodological decisions, and the values attached to statistical analysis. To fully buy into standardized testing, you have to buy into a realist epistemology that numbers can represent the world as statistical facts, and subscribe to the methodological commitment that data-based scientific methods are best suited to surfacing that reality. With the increasing digitization of educational data (both in its production and presentation), websites such as the Learning Curve, Education GPS, School Data Dashboards and School Finder might then be seen as digital policy instruments that enable the ‘reality’ of schooling to be presented as ‘visualized facts.’ These sites are underpinned by the production of educational data through methods derived from the field of data science. Visualizing schools, or national education systems, in this way makes them amenable to being seen, known and acted upon accordingly.
The technical expertise required to produce educational data visualizations, the data science techniques they mobilize, and the various software programmes and algorithms facilitating their production all clearly have a role to play in the production and presentation of education through data. It is notable, for example, that the technical and methodological development of Pearson’s Learning Curve was performed by the Economist Intelligence Unit, an organization peopled by economics and statistics experts, whose methods tend to be largely quantitative and data-based. It specializes particularly in country analysis and profiling, and in regional forecasting for the global economy. Knowing where the data are collected, cleaned up, calculated and then represented, the scientific techniques through which those activities are performed, and the technical experts doing that work, is essential to understanding how digital policy instruments might then carry operational force as they shape particular interpretations into ‘actionable insights.’
While the OECD, DfE, Ofsted and Pearson are highly visible data actors and producers of policy instruments in education, an intriguing development in this emerging policy field is the ‘policy innovation lab.’ ‘Policy innovation labs’ is the collective name I use to refer to a new breed of organizations that also go by names such as ‘social labs,’ ‘innovation teams’ (‘i-teams’), and ‘government innovation labs.’ On the social media platform Twitter, where many of these organizations have a strong presence, they gather under the hashtag ‘#psilabs’, for ‘public and social innovation labs.’ The ‘labification’ of the policy field has rapidly accelerated since 2010, with policy innovation labs ‘applying the principles of scientific labs—experiment, testing and measurement—to social issues.’
A particularly interesting example of a policy innovation lab launched in March 2015: Education Datalab, established as ‘the UK’s centre of excellence for quantitative research in education, providing independent, cutting-edge research to support those leading education policy and practice’:
Education Datalab brings together a group of expert researchers who all believe we can improve education policy by analysing large education datasets. We aim to turn curiosity about education into quantitative analysis: analysis that shapes the way we think about schools and colleges, teachers and learners.
To this end, Education Datalab is mobilizing large datasets such as the National Pupil Database, which contains detailed information on over 7 million pupils matched over 12 years, along with the School Workforce Census, the Longitudinal Study of Young People in England, birth cohort studies, and international datasets such as the Programme for International Student Assessment (PISA), the Teaching and Learning International Survey (TALIS) and the Programme for the International Assessment of Adult Competencies (PIAAC), all of which are administered by the OECD.
Education Datalab is an interesting development. It suggests that educational data analysis could be done in increasingly ‘independent’ research centres and labs, whose emphasis is not so much on the politics of policymaking, but on the realist view that deriving statistical facts through data science methods can provide robust, rigorous and objective insights for policymaking. Policy innovation labs act through data analysis and digital resources to promote their ideas, advice and agendas. In particular they produce new ‘governing methods’—forms of data collection, analysis and evaluation—that are intended to ‘know’ and manage educational institutions and individuals, whilst distantiating themselves from existing political contests. The instruments and methods used by policy innovation labs are presented as ‘a-political’ forms of expertise, and thus by ‘denying their own political character, they depoliticize their own roles as political players.’ But the ways in which labs define the problems they focus on, and the solutions they design, are fundamentally political acts:
Labs are places where people conduct experiments to test out theories. The new labs proliferating outside the hard sciences are a symptom of the spread of experimentalism as an ideology for how we should shape the future.
The policy innovation lab methodology, in this sense, is not merely about representing the statistical facts about education, but about seeking to intervene in that reality, to experiment within it, and to develop not just policy insights but actionable insights that might bring about change. This is, as the quote above indicates, an ideological act of shaping the future of educational institutions—whether by that we mean schools or national systems.
Labs such as Education Datalab are emerging sites for the production of new digital policy instruments that will enable new policies to be put into practice. The Open Policy Making team at the Cabinet Office, which is responsible for the government’s own Policy Lab UK, has even suggested that in the next 5 years a ‘roll-call’ of government terms might include ‘data science, predictive analytics, artificial intelligence, sensors, application programming interfaces, autonomous machines, and platforms.’ Labs are places where such data techniques and methods are being trialled and tested before being applied, and education has become a key testbed where these methods are piloted and new techniques prototyped.
The ‘labification’ of education policy is significant, then, in bringing particular scientific forms of methodological and technical expertise into the policy process, whilst avoiding the politics, values and ideology of conventional policymaking, and also circumventing the concern among policy sociologists about the centralization of educational data in massive international organizations such as the OECD. While researchers of educational data have already got to grips with the statistical techniques underpinning standardized testing, policy-by-numbers approaches, and so on, less has been noted about the specifically digital policy instruments that enable the data to be collected, cleaned, categorized, calculated and then graphically displayed. Even less has been said about new kinds of data actors, such as policy innovation labs, and the forms of technical expertise, digital instruments and methodological repertoires that they bring to the analysis and presentation of educational data. While claiming independence and neutrality through the apparent rigour and objectivity of data analysis, their instruments and methods need to be understood and examined as political devices that are shaping the ways educational institutions are seen, known and acted upon. As yet, we know little about the ‘laboratory life’ inside such spaces, or about the methodological and data-science techniques they mobilize to establish the ‘scientific facts’ of current educational realities.