Recently, the growing global field of policy innovation labs, open policy making, and government labs has made ‘policy learning’ a key concern of government. A number of methodological toolkits to enable policymakers, civil servants and lab staff to learn about service users and their needs have been published. These include the Open Policy Making toolkit from Policy Lab UK, the Lab Practice methodology guide from Kennisland, the Service Innovation Handbook by Lucy Kimbell (also of Policy Lab UK), the Service Design Toolkit from the European SPIDER project, a ‘design for policy’ handbook edited by the head of MindLab, as well as others. The Open Policy Making Team in the Cabinet Office is also planning a major learning and teaching programme for civil servants to learn design thinking. From an educational point of view, there is something deeply pedagogical going on in this work.
The Civil Service Learning initiative has also published extensive documentation on the learning needs for civil servants and policy professionals, in terms of policy knowledge, policy skills, and behavioural skills. As its report states:
A policy professional sees their career, learning and development anchored around policy work and seeks to achieve the level of competence, behaviour and status that goes with being professional in their work. Like all civil servants, policy professionals share a common set of transferable behavioural skills.
The central focus for many of these toolkits and frameworks is the idea that policy professionals need to become adept at learning throughout their career in order to inform and improve their synthesis of evidence, politics and delivery in the formation of new policies. The GovLab digest provides ample evidence of the scale of policy learning required. The kinds of methods of policy learning described in, for example, Policy Lab UK’s open policy making toolkit include design-based methods, user-centred research, digital methods, and data-based methods of analysis and data visualization. Policy professionals are under pressure to learn and adapt to a highly dynamic new set of digital resources and techniques; policy innovation labs are positioning themselves as the educators providing this policy learning. Their toolkits, handbooks and methodological guidance are the pedagogic resources they are using to facilitate it.
In my recent research article on policy innovation labs, I have argued that digital, data and design methods are now becoming increasingly important resources and skills in the governance of public services such as education. Emerging developments such as data analytics, social media analysis, design ethnography, behavioural insights techniques, and rapid prototyping are becoming key methods through which evidence about service users might be derived, and which can inform the learning of policy professionals. I’ve previously posted some thoughts on the politics of policy lab methods, and the need for deeper examination of how these methods are enacted, and to what ends, in specific institutional settings. All these pieces point toward the rising significance of digital, data and design-based methods in the work and learning of policy professionals, and of policy labs as the experts seeking to educate them.
In other words, policy professionals are under pressure to learn advanced methods, often using complex digital tools more familiar to the commercial social media landscape, as part of their everyday policy work. The demands of digital policy work look likely to be amplified in the coming years. Nesta and the Open Policy Making team, for example, are working with the Cabinet Office on a horizon-scanning activity to explore how to mobilize emerging methods of data science, predictive analytics, artificial intelligence, sensors, application programming interfaces, autonomous machines, and digital platforms as a new ‘operating model for government’. The new operating model for government will make great demands in terms of the policy learning, knowledge, and behavioural skills of policy professionals; policy innovation labs are seeking to facilitate the policy learning required by such approaches to governance.
Pedagogies of policy learning
My sense here is that a useful research project could inquire into the ‘pedagogies of policy learning’ implied by these developments, where by ‘pedagogy’ I mean the techniques by which knowledge, skills and values are transmitted from an authority to a learner, in this case the pedagogic authority being those policy lab organizations and individuals that seek to educate the professional policy learner. Policy researchers working in education have referred in the past to a tension between ideas about ‘policy borrowing’ and policy learning. Policy borrowing, it has been claimed, is the process whereby policy makers and advisers exchange and share policy ideas with one another. For example, educational policymakers in countries that perform successfully in international assessments such as the OECD PISA tests routinely find themselves the subject of intense interest from policy makers in other countries, who might want to learn about and borrow their policies to help improve performance in their own country. But, it is claimed, there is rarely much policy learning happening in these exchanges. Policy learning, argues Bob Lingard:
takes account of the research on the effects of the policy in the source system, learning from that and then applying that knowledge to the borrowing system through careful consideration of national and local histories, cultures and so on.
Lingard claims, though, that policy learning is often overridden by political values and ideology, so that research knowledge and policy knowledge derived through policy learning become only one part in a ‘policy pastiche’ that is dominated by other political concerns and interests. Policy borrowing and lending certainly happens in other governmental sectors, with key ideas (e.g. ‘personalization’ and ‘co-production’) even being exchanged across different domains of public services provision, often enabled by innovation labs themselves.
The current emphasis on policy learning, civil service learning, and the production of toolkits to operationalize this learning is therefore a significant development in addressing the deficits of policy borrowing. However, interesting work could be done to explore the nature of this learning, and, in particular, to inquire into the kinds of pedagogies of professional policy learning that might be involved, and the pedagogic actors and resources facilitating it.
Even more particularly, such work would need to inquire into the digital technologies involved in the forms of policy learning required for policy professionals to work with new operating models of government. Highly coded computer technologies are now a major part of professional work and learning in many sectors, not least policymaking. For example, if policy learning in the future is likely to involve the use of data analytics and predictive analytics, then it will be important to examine how policy professionals are inducted into their use and application. Evelyn Ruppert has usefully described ‘database government’: the rapid and agile collection and counting of vast datasets, through techniques of data mining, pattern recognition and social network analysis, for the purposes of both monitoring and manipulating people’s behaviour and thus maintaining the social order as a whole. So what are the pedagogies through which policy learners might be inducted into the techniques of database government, and how are pedagogical agents such as policy innovation labs seeking to educate them?
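To make concrete the kind of technique at stake, here is a minimal sketch of one basic social network analysis measure, degree centrality, of the sort that the data mining and pattern recognition techniques of ‘database government’ build upon. All names and data are hypothetical, and this is only an illustration, not any lab’s actual method.

```python
from collections import Counter

# A toy interaction network among service users (edges are hypothetical).
edges = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dan"), ("dan", "erin"),
]

def degree_centrality(edge_list):
    """Count connections per node: a simple social network analysis
    measure used to identify 'central' individuals in a dataset."""
    counts = Counter()
    for a, b in edge_list:
        counts[a] += 1
        counts[b] += 1
    return dict(counts)

centrality = degree_centrality(edges)
# carol appears in three edges, so she is the most connected user here.
most_central = max(centrality, key=centrality.get)
```

Even a measure as simple as this already embodies analytic choices (here, that ‘centrality’ means raw connection counts), which is precisely why the pedagogies through which policy learners are inducted into such techniques matter.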
We might also need to look at the resources through which policy learning can take place. For example, Pearson plc’s Learning Curve data bank provides the policy learner with access to over 60 internationally-comparable datasets on educational input and output indicators. Intended to support evidence-based policy, and to help governments, teachers and learners identify the common elements of effective education, it is designed to make educational statistics accessible and easy to use. It features a variety of interactive data visualization instruments, such as graphical time series trends tools and user-manipulable heatmaps that enable diverse quantities and qualities of educational data to be transformed and standardized into a common visual metric. Through the application of ‘visual analytics,’ it allows the user to manipulate the images in order to reveal patterns and associations, to conduct comparisons by altering variables, and to build visual models and explanations. In this way, data visualizations such as the Learning Curve can act as a form of visual reasoning, shaping the kinds of policy learning that can take place and the kinds of policies that might be produced.
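The ‘common visual metric’ such tools rely on can be sketched in a few lines. Before a heatmap can colour unlike quantities on one scale, each indicator must be rescaled, and the choice of rescaling is itself an analytic decision. The data and the min–max normalization below are hypothetical illustrations, not Pearson’s actual pipeline.

```python
# Hypothetical indicator values; the Learning Curve's real datasets differ.
indicators = {
    "country_a": {"spending": 9000, "attainment": 520},
    "country_b": {"spending": 4000, "attainment": 480},
    "country_c": {"spending": 6500, "attainment": 505},
}

def normalise(indicators):
    """Rescale each indicator to [0, 1] so that unlike quantities can
    share one colour scale -- the common visual metric a heatmap needs."""
    out = {country: {} for country in indicators}
    for key in next(iter(indicators.values())):
        values = [row[key] for row in indicators.values()]
        lo, hi = min(values), max(values)
        for country, row in indicators.items():
            out[country][key] = (row[key] - lo) / (hi - lo)
    return out

cells = normalise(indicators)
# country_a has the highest value on both indicators, so both of its
# cells map to 1.0 -- the 'hottest' colour on the heatmap.
```

A different rescaling (z-scores, say, or a log transform of spending) would produce a visibly different heatmap from the same data, which is one concrete sense in which visualization shapes the policy learning that can take place.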
What I’m trying to get at here is how the pedagogies of policy learning are integrally bound up in the functioning of digital technologies and resources that have themselves been designed to enable particular kinds of action, to enable particular forms of analysis, and to produce particular kinds of policy insights. One potentially useful way of thinking about this is the idea of ‘programmable pedagogies.’ Programmable pedagogies are the lessons taught by computational systems that have been programmed in accordance with the systems of thinking of technical experts to sculpt particular forms of conduct, catalyze particular behaviours, and delimit particular forms of learning. It’s a term I use to refer to the ways that educational software products project particular codes of conduct into the ways in which they are used. Much contemporary research on software tends toward the argument that the ‘lines of code’ that constitute any application also carry particular codes of conduct; that computer code and algorithms are ‘abstracted theories about the world’ which also ‘have the capacity to become active in shaping and constituting social life’ as the sociologist David Beer argues.
The term ‘programmable pedagogies of policy learning’, then, refers to the ways in which the digital techniques, devices and resources employed in the professional learning of policy professionals might themselves act to shape the kinds of policy analyses and actionable policy insights that are possible. Research in this area would need to inquire into the origins of such devices and resources. How are ‘digital policy instruments’ programmed into being, by which organizations, according to which assumptions? And it might inquire into the ways in which such instruments are received and used by policy learners, or into how their use is framed for them through training courses. Are, for example, data visualization resources designed for the policymaker, such as data dashboards, framed as neutral and apolitical containers of ‘visualized facts’, or are they presented as a socially powerful means for codifying the art of political persuasion into seductive and convincing graphical displays for presentation to different audiences?
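The point about dashboards as codified persuasion can be illustrated with a deliberately simple, hypothetical sketch: the numeric cut-off below is a programmed policy judgement, not a neutral fact, yet a dashboard built on it would present its red flags as ‘visualized facts’.

```python
# Hypothetical dashboard logic. The 0.6 cut-off is an assumption baked
# in by the developer -- a 'code of conduct' written into lines of code.
UNDERPERFORMANCE_THRESHOLD = 0.6

def flag_services(scores, threshold=UNDERPERFORMANCE_THRESHOLD):
    """Return the services a dashboard would colour red."""
    return sorted(name for name, score in scores.items() if score < threshold)

scores = {"service_x": 0.72, "service_y": 0.55, "service_z": 0.61}
flagged = flag_services(scores)        # only service_y falls below 0.6
relaxed = flag_services(scores, 0.65)  # shift the threshold and the 'facts' change
```

Moving the threshold changes which services count as underperforming, which is the sense in which the code itself delimits the actionable policy insights a learner can derive from it.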
These could be important issues and questions to take up as policy learning processes become intertwined with software code, algorithms, and sophisticated methodological and technical techniques of data collection, calculation and circulation. The lines of code and algorithmic forms of data analysis that constitute the programmable pedagogies of policy learning are seriously consequential for the ways in which policymakers will learn to see patterns in data, identify social and public problems, derive actionable policy insights, and put into place new service solutions. If policy borrowing has been shaped by political values and ideology, then digital policy learning could be shaped by the subtler politics and forms of ‘algorithmic power’ written into software code. Policy innovation labs have positioned themselves as pedagogic intermediaries in this context, with the methodological expertise and pedagogic resources to educate policy professionals in new digital, data and design methods of policy innovation.