Super-fast education policy fantasies

Ben Williamson

Image: data server, by CSCW

In recent years the pace of education policy has begun to pick up. As new kinds of policy influencers such as international organizations, businesses, consultancies and think tanks have entered educational debates and decision-making processes, the production of evidence and data to support policy development has become more spatially distributed across sectors and institutions, and invested with greater temporal urgency too. The increasing availability of digital data that can be generated in real time is now catalysing dreams of an even greater acceleration in policy analysis, decision-making and action. A fantasy of real-time policy action is being ushered into material existence, particularly through the advocacy of the global edu-business Pearson and the international organizations OECD (Organisation for Economic Co-operation and Development) and WEF (World Economic Forum). At the same time, the variety of digital data available about aspects of education means that these policy influencers are focusing attention on the possible measurement of previously unrecorded activities and processes.

Fast policy
Education policy processes are undergoing a transformation. A spatial redistribution of policy processes is underway whereby government departments are becoming parts of ‘policy networks’ that also include consultants, think tanks, policy labs, businesses, and international non-governmental organizations.

In their recent book Fast Policy, policy geographers Jamie Peck and Nik Theodore argue that:

The modern policymaking process may still be focused on centers of political authority, but networks of policy advocacy and activism now exhibit a precociously transnational reach; policy decisions made in one jurisdiction increasingly echo and influence those made elsewhere; and global policy ‘models’ often exert normative power across significant distances. Today, sources, channels, and sites of policy advice encompass sprawling networks of human and nonhuman actors/actants, including consultants, web sites, practitioner communities, norm-setting models, conferences, guru performances, evaluation scientists, think tanks, blogs, global policy institutes, and best-practice peddlers, not to mention the more ‘hierarchical’ influence of multilateral agencies, international development funds, powerful trading partners, and occupying powers.

These policy networks sometimes do the job of the state through outsourced contracts, commissioned evidence-collection and analysis, and the production of policy consultancy for government. They often also act as channels for the production of policy influence, bringing new agendas, new possibilities, and new solutions to perceived problems into the view of national government departments and policymakers. Policy is, therefore, becoming more socially and spatially distributed across varied sites, across public, private and third sectors, and increasingly involves the hybridization of methods drawn from all the actors involved in it, particularly in relation to the production and circulation of evidence that might support a change in policy.

The socially and spatially networked nature of the contemporary education policy environment is leading to a temporal quickening in the production and communication of evidence. In the term ‘fast policy’, Peck and Theodore describe a new condition of accelerated policy production, circulation and translation that is characterized not just by its velocity but also ‘by the intensified and instantaneous connectivity of sites, channels, arenas, and nodes of policy development, evolution, and reproduction.’ Fast policy refers to the increasing porosity between policymaking locales; the transnationalization of policy discourses and communities; global deference to models of ‘what works’ and ‘best practices’; compressed R&D time in policy design and roll-out; new shared policy experimentality and evaluation practices; and the expansion of a ‘soft infrastructure’ of expert conferences, resource banks, learning networks, case-study manuals, and web-based materials, populated by intermediaries, advocates, and experts.

Fast policy is becoming a feature of education policy production and circulation. As Steven Lewis and Anna Hogan have argued,

actors work within complex policy networks to produce and promote evidence tailored to policymakers, meaning they orchestrate rather than produce research knowledge in order to influence policy production. These actors tend to construct simplified and definitive solutions of best practice, and their reports are generally short, easy-to-read and glossy productions.

As a consequence, they claim, the desire for policy solutions and new forms of evidence and expertise is ultimately leading to the ‘speeding up’ of policy:

This ‘speeding up’ of policy, or ‘fast policy’ … is characterized not only by the codification of best practice and ‘ideas that work’ but also, significantly, by the increasing rate and reach of such policy diffusion, from sites of policy development and innovation to local sites of policy uptake and, if not adoption, translation.

In other words, policies are becoming more fast-moving, both in their production and in their translation into action, as well as more transnational in uptake and implementation, more focused on quick-fix ‘best practice’ or ‘what works’ solutions, and more pacey and attractive to read thanks to being packaged up as short glossy handbooks and reports, websites and interactive data visualizations.

For Lewis and Hogan, the development of fast policy in education is exemplified by the work of the education business Pearson and the international organization OECD. In their specific example of fast policy in action, they observe how ‘so-called best practices travel from their point of origin (to the extent that this can ever be definitively fixed) at the OECD to their uptake and development by an international edu-business (Pearson),’ and how they are from there translated into more ‘localized’ concerns with improving state-level schooling performance within national systems. In particular they show how OECD data collected as part of the global PISA testing program have been translated into Pearson’s Learning Curve Databank, itself a public data resource intended to inform ‘evidence-based’ educational policymaking around the world, and from there mobilized in the specification of local policy problems and solutions. The concern with evidence-based policymaking, they show, involves the use of best practice models and learning from ‘examples’:

We see the dominance of fast policy approaches, and hence their broad appeal across policy domains such as schooling, as directly emanating from the promotion of decontextualised best practices that can, so it is alleged, transcend the specific requirements of local contexts. This is despite ‘evidence-based’ policymaking being an inherently political and contingent process, insofar as it is always mediated by judgements, priorities and professional values specific to the people, moments and places in which such policies are to be enacted.

Additionally, in the fast policy approaches that are developing in education through the work of OECD and Pearson, quantitative data have become especially significant for evidence-based practices, as measurement, metrics, ranking and comparison all help to create new continuities and flows that can overcome physical distance in an increasingly interconnected and accelerating digital world. Numbers and examples form the evidential flow of fast policy, enabling complex social, political and economic problems to be rendered in easy-to-understand tables, diagrams and graphs, and their solutions to be narrated and marketed through exemplar best practice case studies.

Real-time policy action
Pearson and OECD are additionally seeking to develop new computer-based data analytics techniques that can be used to generate evidence to inform education policy. Pearson, for example, has proposed a ‘renaissance in assessment’ that will involve a shift to new computer-based assessment systems for the continuous tracking and monitoring of ‘streaming data’ through real-time analytics, rather than the collection of data through discrete temporal assessment events. Its report promotes the use of ‘intelligent software and a range of devices that facilitate unobtrusive classroom data collection in real time’ so as to ‘track learning and teaching at the individual student and lesson level every day in order to personalise and thus optimise learning.’ Much of the data analytic and adaptive technology required by this vision is in development at Pearson’s own Center for Digital Data, Analytics and Adaptive Learning, its in-house centre for educational big data research and development.

Moreover, the authors of the renaissance in assessment report argue for a revolution in education policy, shifting the focus from the governance of education through the institution of the school to ‘the student as the focus of educational policy and concerted attention to personalising learning.’ The report clearly represents an emerging educational imaginary where policy is to concentrate on the real-time tracking of the individual rather than the planned and sequenced longitudinal measurement of the institution or system. Along these lines, its authors note that the OECD itself is moving towards new forms of machine learning in its international assessment technologies, with a proposal to assess collaborative problem solving through ‘a fully computer-based assessment in which a student interacts with a simulated collaborator or “avatar” in order to solve a complex problem.’ Such systems, for Pearson and OECD, can speed up the process of providing feedback to students, but are, importantly, also adaptive, meaning that the content adapts to the progress of the student in real time.

The potential promise of such computer-based adaptive systems, for the experts of Pearson and OECD, is a further acceleration of policy development to real-time speed. Instead of policy based on the long time-scales of temporally discrete assessment events, data analytics platforms appear to make it possible to perform constant automated analysis of the digital timestream of student activities and tasks. Such systems can then adapt to the student in ways that are synchronized with their learning processes. This appears to make it feasible to squeeze out conventional standardized assessments and tests, with their associated bureaucratic processes of data collection by governmental centres of political authority, and replace them with computer-adaptive systems.
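To make the contrast between discrete assessment events and continuous analysis concrete, here is a minimal sketch of what an always-on adaptive loop might look like in principle. The event fields, update rule and thresholds are all invented for illustration, not drawn from any system Pearson or the OECD describes: each recorded interaction immediately updates a running estimate of the learner, which in turn determines the next task.

```python
from dataclasses import dataclass
import random

@dataclass
class AttemptEvent:
    """One record in a student's digital timestream (hypothetical schema)."""
    student_id: str
    item_difficulty: float   # 0.0 (easy) to 1.0 (hard)
    correct: bool

def update_estimate(mastery: float, event: AttemptEvent, rate: float = 0.2) -> float:
    """Nudge the running mastery estimate after every event rather than waiting for a
    discrete assessment: success on harder items raises it, failure lowers it."""
    signal = event.item_difficulty if event.correct else event.item_difficulty - 1.0
    return min(1.0, max(0.0, mastery + rate * signal))

def next_item_difficulty(mastery: float) -> float:
    """Adapt the next task so that it sits just beyond the current estimate."""
    return min(1.0, mastery + 0.1)

# Simulated always-on loop: analysis and adaptation happen inside the flow of activity.
mastery = 0.5
for _ in range(10):
    difficulty = next_item_difficulty(mastery)
    correct = random.random() < 0.6          # stand-in for the student's actual response
    mastery = update_estimate(mastery, AttemptEvent("student-01", difficulty, correct))
    print(f"difficulty={difficulty:.2f} correct={correct} mastery={mastery:.2f}")
```

Even in this toy form, the discrete assessment ‘event’ disappears: measurement, inference and intervention collapse into a single loop synchronized with the student’s activity.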

These proposals imagine a super-fast policy process that is at least partly automated, and certainly accelerated beyond the temporal threshold of human capacities for data analysis and expert professional judgment. Heather Roberts-Mahoney and colleagues have analysed US documents advocating the use of real-time data analytics for personalized learning, and conclude that they transform teachers into ‘data collectors’ who ‘no longer have to make pedagogical decisions, but rather manage the technology that will make instructional decisions for them,’ since ‘curriculum decisions, as well as instructional practices, are reduced to algorithms and determined by adaptive computer-based systems that create “personalized learning,” thereby allowing decision-making to take place externally to the classroom.’ The role of policymakers is changed by such systems too, turning them into awarders of contracts to data-processing companies and technology vendors of adaptive personalized learning products. It is through such technical platforms and the instructions coded into them that decisions about intervention will be made at the individual level, rather than through bureaucratic decision-making at national or state system scale.

The use of real-time systems in education is therefore part of ‘a reconfiguring of intensities, or “speeds”, of institutional life’ as it is ‘now “plugged into” information networks,’ as Greg Thompson has argued. It makes the collection, analysis and feedback from student data into a synchronous loop that functions at extreme velocity through systems that are hosted by organizations external to the school but are also networked into the pedagogic routines of the adaptive, personalized classroom.

Affective policy
Importantly, these fast policy influencers are also pursuing the possibility of measuring non-academic aspects of learning such as social and emotional skills. The OECD has launched its Education and Social Progress project to develop specific measurement instruments for ‘social and emotional skills such as perseverance, resilience and agreeableness,’ with the aim of ‘using the evidence collected, for policy-makers, school administrators, practitioners and parents to help children achieve their full potential, improve their life prospects and contribute to societal progress.’

The World Economic Forum, another major international organization that works in policy networks to influence education policy, has similarly produced a report on fostering social and emotional learning through technology. It promotes the development of biosensor technologies, wearable devices and other applications that can be used to ‘provide a minute-by-minute record of someone’s emotional state’ and ‘to help students manage their emotions.’ It even advocates educational applications of ‘affective computing’:

Affective computing comprises an emerging set of innovations that allow systems to recognize, interpret and simulate human emotions. While current applications mainly focus on capturing and analysing emotional reactions to improve the efficiency and effectiveness of product or media testing, this technology holds great promise for developing social and emotional skills such as greater empathy, improved self-awareness and stronger relationships.

The affective analytics of education being proposed by both the OECD and WEF make the emotional life of the school child into the subject of fast policy experimentation. They are seeking to synchronize children’s emotional state, measured as a ‘minute-by-minute record,’ with societal progress, rendering students’ emotions as real-time digital timestreams of data that can be monitored and then used as evidence in the evaluation of various practices and policies. Timestreams of data about how students feel are being positioned by policy influencers the OECD and WEF as a new form of evidence at a time of accelerating policy experimentation. These proposals are making sentiment analysis into a key fast policy technology, enabling policy interventions and associated practices to be evaluated in terms of the feelings they generate–a way of measuring not just the effects of policy action but its production of affect too.
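What an affective evidence stream might actually look like is left vague in these proposals, but a toy sketch shows how simple the underlying move is. Everything below is hypothetical: an invented minute-by-minute affect score (no real sensing instrument or scale is assumed) is aggregated into a before-and-after comparison around an imagined classroom intervention, producing exactly the kind of number that could then circulate as fast policy evidence.

```python
from statistics import mean

# Hypothetical minute-by-minute affect scores for one class (-1.0 negative to +1.0 positive).
affect_timestream = {
    1: -0.2, 2: -0.1, 3: 0.0, 4: 0.3, 5: 0.4, 6: 0.1, 7: 0.5, 8: 0.6,
}

INTERVENTION_MINUTE = 4   # an imagined change of activity whose 'affective impact' is evaluated

before = [score for minute, score in affect_timestream.items() if minute < INTERVENTION_MINUTE]
after = [score for minute, score in affect_timestream.items() if minute >= INTERVENTION_MINUTE]

# The 'evidence' such a system would yield: measured feeling, before and after.
print(f"mean affect before: {mean(before):+.2f}")
print(f"mean affect after:  {mean(after):+.2f}")
```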

Following super-fast policy prototypes
Writing about fast policy in an earlier paper that preceded their recent book, Jamie Peck and Nik Theodore have described ‘policy prototypes that are moving through mutating policy networks’ and which connect ‘distant policy-making sites in complex webs of experimentation-emulation-evolution.’ They describe the methodological challenges of ‘following the policy’ in the context of spatially distributed policy networks and temporally accelerated modes of policy development where specific policies are in a constant state of movement, translation and transformation. For them:

Policy designs, technologies, and frames are … complex and evolving social constructions rather than as concretely fixed objects. In fact, these are very often the means and the media through which relations between distant policy-making sites are actively made and remade.

A research focus on the kind of super-fast policy prototypes being developed by Pearson, the WEF and the OECD would likewise need to focus, methodologically, on the technologies and the designs of computer-based approaches as socially created devices. It would need to follow these policy prototypes through processes of experimentation, emulation and mutation, as they are diversely developed, taken up or resisted, and modified and amended through interaction with other organizations, actors, discourses and agendas. As with Peck and Theodore’s focus on fast policy, researching the super-fast policy prototypes proposed for education by the OECD, WEF and Pearson would investigate the ‘social life’ of the production of new technologies of computer-adaptive assessment, personalized learning, affective computing and so on, but also attend to their social productivity as they change the ways in which education systems, institutions, and the individuals within them perform.


Performing data

‘Performance information’ in the Scottish Government national improvement plan for education

Ben Williamson


At the end of June 2016 the Scottish Government published a major national delivery plan for improving Scottish education over the next few years. Drafted in response to a recent independent review of Scottish education carried out by the OECD, the delivery plan is part of a National Improvement Framework with ambitious aims to raise attainment and achieve equity.

It is the relentless focus of the delivery plan on the use of performance measurement, metrics and evidence gathering to drive forward these improvements that is especially arresting. In a striking line from the introduction it is stated that:

As the OECD review highlighted, current … arrangements do not provide sufficiently robust information across the system to support policy and improvement. We must move from a culture of judgement to a system of judgement.

A ‘system of judgment’: right from the start, it is clear that the delivery plan is based on the understanding—imported from the OECD via its recommendation that new ‘metrics’ be devised to measure Scottish education—that data can be used to drive forward performance improvement and to discipline under-performance.

Productive measurement
In a series of articles, the sociologist David Beer has been writing about the socially productive power of metrics in a variety of sectors and institutions of society:

We often think of measurement as in some way capturing the properties of the world we live in. This might be the case, but we can also suggest that the way that we are measured produces certain outcomes. We adapt to the systems of measurement that we are living within.

Metrics and measurements are not simply descriptive of the world, then, but play a part in reshaping it in particular ways, affecting how people behave and understand things and act to do things differently. As Beer elaborates:

The measurements themselves matter, but it is knowing or expecting how we will be measured that is really powerful. Systems of measurement then have productive powers in our lives, both in terms of how we respond to them and how they inform the judgments and decisions that impact upon us.

Performance measurement techniques, of the kind to be implemented through the Scottish Government’s proposed ‘system of judgement’, can similarly be understood as productive measures that will be used to attach evaluative numbers to practices and institutions in ways that are intended to change how the system performs overall. This is likely to affect how school teachers, leaders, and maybe even pupils themselves and their parents act and perform their roles, as they expect to be measured, judged, and acted upon as a result.

‘Performance information’ is one of the key ‘drivers of improvement’ listed in the plan, and clearly shows how a range of ‘measures’ are to be collected:

We will pull together all the information and data we need to support improvement.  Evidence suggests … we must ensure we build a sound understanding of the range of factors that contribute to a successful education system. This is supported by international evidence which confirms that there is no specific measure that will provide a picture of performance. We want to use a balanced range of measures to evaluate Scottish education and take action to improve further.

Scanning through the plan and the improvement framework, it becomes clear just how extensive this new focus on performance measurement will become. The plan emphasizes:

  • the use of standardized assessment to gather attainment data
  • the gathering of diverse data about the academic progress and well-being of pupils at all stages
  • pre-inspection questionnaires, school inspection and local authority self-evaluation reports
  • the production of key performance indicators on employability skills
  • greater performance measurement of schools
  • new standards and evaluation frameworks for schools
  • information on teacher induction, teacher views, and opportunities for professional learning
  • evidence on the impact of parents in helping schools to improve
  • regular publication of individual school data
  • the use of visual data dashboards to make school data transparent
  • training for ‘data literacy’ among teachers
  • comparison with international evidence

All of this is in addition to system-wide national benchmarking, international comparisons, defining and monitoring standards, and quality assurance, and it is all to be overseen by an international council of expert business and reform advisors who will guide and evaluate its implementation.

Performative numbers
The delivery plan makes for quite a cascade of new and productive measures–an ‘avalanche of numbers’–though Scottish schools are unlikely to be terribly surprised by the emphasis in the delivery plan on performance information, targets, performance indicators and timelines. (In England the emphasis on performance data has been even more pronounced, with Paul Morris claiming ‘the purposes of schooling and what it means to be educated are effectively being redefined by the metrics by which we evaluate schools and pupils.’)

Since 2014, all Scottish schools have been encouraged by the Scottish Government to make use of Insight, an online benchmarking tool ‘designed for use by secondary schools and local authorities to identify success and areas where improvements can be made, with the ultimate aim of making a positive difference for pupils’. It provides data on ‘four national measures, including post-school destinations and attainment in literacy and numeracy as well as information on a number of local measures designed to help users take a closer look at their curriculum, subjects and courses’. It features data dashboards that allow schools to view an overall picture of the data from their school and compare it with the national measures presented on the national dashboard.

A notable feature of Insight is the ‘Virtual Comparator’ which allows users to see how the performance of their pupils compares to a similar group of pupils from across Scotland. The Virtual Comparator feature takes the characteristics of pupils in a school and matches them to similar pupils from across Scotland to create a ‘virtual school’ against which a ‘real’ school may benchmark its progress.
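Public descriptions of Insight do not set out the matching algorithm in detail, so the sketch below is only an illustrative reconstruction of the general logic of a virtual comparator, not Insight’s actual implementation: the matching variables, the sample size and the data structures are all assumptions.

```python
import random
from statistics import mean

def matching_key(pupil):
    """Characteristics used to identify 'similar' pupils nationally; the variables here
    (stage, sex, deprivation decile, additional support needs) are assumed, not Insight's own."""
    return (pupil["stage"], pupil["sex"], pupil["simd_decile"], pupil["asn"])

def virtual_comparator(school_pupils, national_pupils, sample_size=10, seed=0):
    """For each pupil in the school, sample matched pupils from the national dataset and
    average their attainment, producing a 'virtual school' benchmark figure."""
    rng = random.Random(seed)
    matched_pool = {}
    for pupil in national_pupils:
        matched_pool.setdefault(matching_key(pupil), []).append(pupil)

    virtual_scores = []
    for pupil in school_pupils:
        candidates = matched_pool.get(matching_key(pupil), [])
        if candidates:
            sample = rng.sample(candidates, min(sample_size, len(candidates)))
            virtual_scores.append(mean(p["attainment"] for p in sample))
    return mean(virtual_scores) if virtual_scores else None

# Usage with fabricated records: compare a school's mean attainment to its virtual comparator.
# school_mean = mean(p["attainment"] for p in school_pupils)
# benchmark = virtual_comparator(school_pupils, national_pupils)
```

The benchmark produced this way is itself a productive measure: the ‘virtual school’ is a statistical construct, yet it becomes the standard against which a real school’s performance is judged.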

The relentless focus by the Scottish Government on performance information, inspection, comparison, measurement and evidence is demonstrative of how education systems, organizations and individuals are now subject to increasing demands to produce data.

As the concept of ‘productive measures’ reminds us, though, performance measurement is not simply descriptive. It also brings the matter it describes into being. As captured in the term ‘performativity,’ education systems and institutions, and even individuals themselves, are changing their practices to ensure the best possible measures of performance. Closely linked to this is the notion of accountability, that is, the production of evidence that proves the effectiveness—in terms of measurable results—of whatever has been performed in the name of improvement and enhancement. As Stephen Ball phrases it:

Performativity is … a regime of accountability that employs judgements, comparisons and displays as a means of control, attrition and change. The performance of individuals and organizations serve as measures of productivity or output … [and] stand for, encapsulate or represent the worth, quality or value of an individual or organization within a field of judgement.

In other words, performativity makes the question of what counts as worthwhile activity in education into the question of what can be counted and of what account can be given for it. It reorients institutions and individuals to focus on those things that can be counted and accounted for with evidence of their delivery, results and positive outcomes, and de-emphasises any activities that cannot be easily and effectively measured.

In practical terms, performativity depends on databases, audits, inspections, reviews, reports, and the regular publication of results, and tends to prioritize the practices and judgements of accountants, lawyers and managers who subject practitioners to constant processes of target-setting, measurement, comparison and evaluation. The appointment of an international council of experts to oversee the collection and analysis of all the performance information required by the improvement and delivery plans is ample illustration of how Scottish education will be subject to a system of expert techniques and judgement.

Political analytics
It is hard, then, to see the Scottish Government delivery plan as anything other than a series of policy instruments that, via specific data-driven techniques and particular technical tools, will reinforce performativity and accountability, all under the aspiration of closing attainment gaps and achieving equity.

Although no explicit mention is made of the technologies required to enact this system of judgement, it is clear that a complex data infrastructure of technologies and technical experts will also be needed to collect, store, clean, filter, analyse, visualize and communicate the vast masses of performance information. Insight and other dashboards already employed in Scottish education are existing products that doubtless anticipate a much more system-wide digital datafication of the sector. Data processing technologies are making the performance of education systems and institutions into enumerated timestreams of data by which they might be measured, evaluated and assessed, held up to both political and public scrutiny, and then made to account for their actions and decisions, and either rewarded or disciplined accordingly. A new kind of political analytics that prioritizes digitized forms of data collection and analysis is likely to play a powerful role in the governance of Scottish education in coming years.

Data technologies of various kinds are the enablers of performativity and accountability, translating their numerical logics into the material and practical realities of professional life. As a data-driven ‘system of judgement’, Scotland’s delivery plan for education will, in other words, usher more and more ‘productive measures’ into Scottish education, reconfiguring it and those who work and learn in it in ways that will need to be studied closely for many years to come.

 


Critical questions for big data in education

Ben Williamson

Image: data center (https://flic.kr/p/bnZvFX)

Big data has arrived in education. Educational data science, learning analytics, computer adaptive testing, assessment analytics, educational data mining, adaptive learning platforms, new cognitive systems for learning and even educational applications based on artificial intelligence are fast becoming parts of the educational landscape, in schools, colleges and universities, as well as in the networked spaces of online courses.

As part of a recent conversation about the Shadow of the Smart Machine work on machine learning algorithms being undertaken by Nesta, I was asked what I thought were some of the most critical questions about big data and machine learning in education. This reminded me of the highly influential paper ‘Critical questions for big data’ by danah boyd and Kate Crawford, in which they ‘ask critical questions about what all this data means, who gets access to what data, how data analysis is deployed, and to what ends.’

With that in mind, here are some preliminary (work-in-progress) critical questions to ask about big data in education.

How is ‘big data’ being conceptualized in relation to education?
Large-scale data collection has been at the centre of the statistical measurement, comparison and evaluation of the performance of education systems, policies, institutions, staff and students since the mid-1800s. Does big data constitute a novel way of enumerating education? The sociologist David Beer has suggested we need to think about the ways in which big data as both a concept and a material phenomenon has appeared as part of a history of statistical thinking, and in relation to the rise of the data analytics industry—he suggests social science still needs to understand ‘the concept itself, where it came from, how it is used, what it is used for, how it lends authority, validates, justifies, and makes promises.’ Within education specifically, how is big data being conceptualized, thought about, and used to animate specific kinds of projects and technical developments? Where did it come from–data science, computer science–and who are its promoters and sponsors in education? What promises are attached to the concept of big data as it is discussed within the domain of education? We might wish to think about a ‘big data imaginary’ in education—a certain way of thinking about, envisaging and visioning the future of education through the conceptual lens of big data—that is now animating specific technical projects, becoming embedded in the material reality of educational spaces and enacted in practice.

What theories of learning underpin big data-driven educational technologies?
Big data-driven platforms such as learning analytics aim to ‘optimize learning’ but is it always clear what is meant by ‘learning’ by the organizations and actors that build, promote and evaluate them? Much of the emerging field of ‘educational data science’—which encompasses much educational data mining, learning analytics and adaptive learning software R&D—is informed by conceptualizations of learning that are rooted in cognitive science and cognitive neuroscience. These disciplines tend to focus on learning as an ‘information-processing’ event—to treat learning as something that can be monitored and optimized like a computer program—and pay less attention to the social, cultural, political and economic factors that structure education and individuals’ experiences of learning.

Given the statistical basis of big data, it’s perhaps also not surprising that many actors involved in educational big data analyses are deeply informed by the disciplinary practices and assumptions of psychometrics and its techniques of psychological measurement of knowledge, skills, personality and so on. Aspects of behaviourist theories of learning even persist in behaviour management technologies that are used to collect data on students’ observed behaviours and distribute rewards to reinforce desirable conduct. There is an emerging tension between the strongly psychological, neuroscientific and computational ways of conceptualizing and theorizing learning that dominate big data development in education, and more social scientific critiques of the limitations of such theories.

How are machine learning systems used in education being ‘trained’ and ‘taught’?
The machine learning algorithms that underpin much educational data mining, learning analytics and adaptive learning platforms need to be trained, and constantly tweaked, adjusted and optimized to ensure accuracy of results–such as predictions about future events. This requires ‘training data’: a corpus of historical data with which the algorithms can be ‘taught’ before being used to find patterns in data ‘in the wild.’ Who selects the training data? How do we know if it is appropriate, reliable and accurate? What if the historical data is in some ways biased, incomplete or inaccurate? Does this risk generating ‘statistical discrimination’ of the sort produced by ‘predictive policing,’ which has in some cases been found to disproportionately predict that black men will commit crime? Educational research has long asked questions about the selection of the knowledge for inclusion in school curricula that is to be taught to students—we may now need to ask about the selection of the data for inclusion in the training corpus of machine learning platforms, as these data could be consequential for learners’ subsequent educational experience.
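A deliberately crude sketch makes the training data point concrete. The ‘model’ below simply learns historical flag rates per pupil group from a fabricated corpus; if the historical labels were skewed, its predictions reproduce that skew for new pupils who are otherwise identical. All groups, numbers and labels are invented for illustration.

```python
from collections import defaultdict

# Fabricated historical training data: (pupil_group, was_flagged_at_risk).
# Group B was flagged far more often in the past, whether or not that reflected real need.
training_data = ([("A", False)] * 90 + [("A", True)] * 10 +
                 [("B", False)] * 60 + [("B", True)] * 40)

def train_base_rate_model(rows):
    """A deliberately simple 'model': learn the historical flag rate for each group."""
    counts, flags = defaultdict(int), defaultdict(int)
    for group, flagged in rows:
        counts[group] += 1
        flags[group] += flagged
    return {group: flags[group] / counts[group] for group in counts}

model = train_base_rate_model(training_data)

# Two new pupils with identical current behaviour but different group labels inherit
# very different predicted risk, purely from the shape of the historical corpus.
for group in ("A", "B"):
    print(f"predicted at-risk probability for group {group}: {model[group]:.2f}")
```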

Moreover, we might need to ask questions about the nature of the ‘learning’ being experienced by machine learning algorithms, particularly as enthusiastic advocates in places like IBM are beginning to propose that advanced machine learning is more ‘natural,’ with ‘human qualities,’ based on computational models of aspects of human brain functioning and cognition. To what extent do such claims appear to conflate understandings of the biological neural networks of the human brain that are mapped by neuroscientists with the artificial neural networks designed by computer scientists? Does this reinforce computational information-processing conceptualizations of learning, and risk addressing young human minds and the ‘learning brain’ as computable devices that can be debugged and rewired?

Who ‘owns’ educational big data?
The sociologist Evelyn Ruppert has asked ‘who owns big data?’, noting that numerous people, technologies, practices and actions are involved in how data is shaped, made and captured. The technical systems for conducting educational big data collection, analysis and knowledge production are expensive to build. Specialist technical staff are required to program and maintain them, to design their algorithms, and to produce their interfaces. Commercial organizations see educational data as a potentially lucrative market, and ‘own’ the systems that are now being used to see, know and make sense of education and learning processes. Many of their systems are proprietary, wrapped in IP and patents, which makes it impossible for other parties to understand how they are collecting data, what analyses they are conducting, or how robust their big data samples are. Specific commercial and political ambitions may also be animating the development of educational data analytics platforms, particularly those associated with Silicon Valley, where ed-tech funding for data-driven applications is soaring and tech entrepreneurs are rapidly developing data-driven educational software and even new institutions.

In this sense, we need to ask critical questions about how educational big data are made, analysed and circulated within specific social, disciplinary and institutional contexts that often involve powerful actors that possess significant economic capital in the shape of funding and resourcing, cultural capital in terms of the production of new specialist knowledge, and social capital through wider networks of affiliations, partnerships and connections. The question of the ownership of educational big data needs to be located in relation to these forms of capital and the networks where they circulate.

Who can ‘afford’ educational big data?
Not all schools, colleges or universities can necessarily afford to purchase a learning analytics or adaptive software platform—or to partner with platform providers. This risks certain wealthy institutions being able to benefit from real-time insights into learning practices and processes that such analytics afford, while other institutions will remain restricted to the more bureaucratic analysis of temporally discrete assessment events.

Can educational big data provide a real-time alternative to temporally discrete assessment techniques and bureaucratic policymaking?
Policy makers in recent years have depended on large-scale assessment data to help inform decision-making and drive reform—particularly the use of large-scale international comparative data such as the datasets collected by OECD testing instruments. Educational data mining and analytics can provide a real-time stream of data about learners’ progress, as well as automated real-time personalization of learning content appropriate to each individual learner. To some extent this changes the speed and scale of educational change—removing the need for cumbersome assessment and country comparison and distancing the requirement for policy intervention. But it potentially places commercial organizations (such as the global education business Pearson) in a powerful new role in education, with the capacity to predict outcomes and shape educational practices at timescales that government intervention cannot emulate.

Is there algorithmic accountability in educational analytics?
Learning analytics is focused on the optimization of learning and one of its main claims is the early identification of students at risk of failure. What happens if, despite being enrolled on a learning analytics system that has personalized the learning experience for the individual, that individual still fails? Will the teacher and institution be accountable, or can the machine learning algorithms (and the platform organizations that designed them) be held accountable for their failure? Simon Buckingham Shum has written about the need to address algorithmic accountability in the learning analytics field, and noted that ‘making the algorithms underpinning analytics intelligible’ is one way of at least making them more transparent and less opaque.
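One version of that intelligibility, offered here purely as an illustration rather than as any vendor’s actual practice, is to use models whose predictions can be decomposed feature by feature, so that a teacher or student can be shown why a particular risk score was produced. A minimal sketch with invented features and weights:

```python
# An illustrative, transparent at-risk score: every weight is visible and every
# prediction can be broken down feature by feature. Features and weights are invented.
WEIGHTS = {
    "missed_deadlines": 0.30,
    "vle_logins_per_week": -0.05,
    "average_quiz_score": -0.01,
}
BIAS = 0.8

def risk_score(student):
    """Return the overall score plus per-feature contributions that could be shown
    to the teacher and student as an account of how the prediction was reached."""
    contributions = {feature: WEIGHTS[feature] * student[feature] for feature in WEIGHTS}
    return BIAS + sum(contributions.values()), contributions

score, breakdown = risk_score(
    {"missed_deadlines": 3, "vle_logins_per_week": 2, "average_quiz_score": 55}
)
print(f"risk score: {score:.2f}")
for feature, contribution in breakdown.items():
    print(f"  {feature}: {contribution:+.2f}")
```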

Is student data replacing student voice?
Data are sometimes said to ‘speak for themselves,’ but education has a long history of encouraging learners to speak for themselves too. Is the history of pupil voice initiatives being overwritten by the potential of pupil data, which proposes a more reliable, accurate, objective and impartial view of the individual’s learning process unencumbered by personal bias? Or can student data become the basis for a data-dialogic form of student voice, one in which teachers and their students are able to develop meaningful and caring relationships through mutual understanding and discussion of student data?

Do teachers need ‘data literacy’?
Many teachers and school leaders possess little detailed understanding of the data systems that they are using, or are required to use. As glossy educational technologies like ClassDojo are taken up enthusiastically by millions of teachers worldwide, might it be useful to ensure that teachers can ask important questions about data ethics, data privacy and data protection, and are able to engage with educational data in an informed way? Despite calls in the US to ensure that data literacy become the focus for teachers’ pre-service training, there appears to be little sign that the provision of data literacy education for educational practitioners is being developed in the UK.

What ethical frameworks are required for educational big data analysis and data science studies?
The UK government recently published an ethical framework for policymakers for use when planning data science projects. Similar ethical frameworks to guide the design of educational big data platforms and education data science projects are necessary.

Some of these questions clearly need further refinement, but they make clear, I think, the need for more work to critically interrogate big data in education.


Artificial intelligence, cognitive systems and biosocial spaces of education

By Ben Williamson

Image: telephone cable model of corpus callosum, by Brewbooks

Recently, new ideas about ‘artificial intelligence’ and ‘cognitive computing systems’ in education have been advanced by major computing and educational businesses. How might these ideas and the technical developments and business ambitions behind them impact on educational institutions such as schools, and on the role of human actors such as teachers and learners, in the near future? More particularly, what understandings of the human teacher and the learner are assumed in the development of such systems, and with what potential effects?

The focus here is on the education business Pearson, which published a report entitled Intelligence Unleashed: An argument for AI in education in February 2016, and the computing company IBM, which launched Personalized Education: from curriculum to career with cognitive systems in May 2016. Pearson’s interest in AI reflects its growing profile as an organization using advanced forms of data analytics to measure educational institutions and practices, while IBM’s report on cognitive systems makes a case for extending its existing R&D around cognitive computing into the education sector.

AI has been the subject of serious concern recently, with warnings from high-profile figures including Stephen Hawking, Bill Gates and Elon Musk, while awareness about cognitive computing has been fuelled by widespread media coverage of Google’s AlphaGo system, which beat one of the world’s leading Go players back in March. Commenting on these recent events, the philosopher Luciano Floridi has noted that contemporary AI and cognitive computing cannot be characterized in monolithic terms as some kind of ‘ultraintelligence’; instead they are manifesting themselves in far more mundane ways through an ‘infosphere’ of ‘ordinary artefacts that outperform us in ever more tasks, despite being no cleverer than a toaster’:

The success of our technologies depends largely on the fact that, while we were speculating about the possibility of ultraintelligence, we increasingly enveloped the world in so many devices, sensors, applications and data that it became an IT-friendly environment, where technologies can replace us without having any understanding, mental states, intentions, interpretations, emotional states, semantic skills, consciousness, self-awareness or flexible intelligence. Memory (as in algorithms and immense datasets) outperforms intelligence when landing an aircraft, finding the fastest route from home to the office, or discovering the best price for your next fridge. Digital technologies can do more and more things better than us, by processing increasing amounts of data and improving their performance by analysing their own output as input for the next operations.

Contemporary algorithmic forms of AI that learn from the vast memory-banks of big data do not constitute either an apocalyptic or benevolent future of AI or cognitive systems, but, for Floridi, reflect human ambitions and problems.

So why are companies like Pearson and IBM advancing claims for their benefits in education, and to address which ambitions and problems? Extending from my recent work on both Pearson’s digital methods and IBM’s cognitive systems R&D programs (all part of an effort to map out the emerging field of ‘educational data science’), I suggest these developments can be understood in terms of growing recognition of the connections between computer technologies, social environments, and embodied human experience.

Pearson intelligence
Pearson has been promoting itself as a new source of expertise in educational big data analysis since establishing its Center for Digital Data, Analytics and Adaptive Learning in 2012. Its ambitions in the direction of educational data analytics are to make sense of the masses of data becoming available as educational activities increasingly occur via digital media, and to use these data and patterns extracted from them to derive new theories of learning processes, cognitive development, and non-academic social and emotional learning. It has also begun publishing reports under its ‘Open Ideas’ theme, which aim to make its research available publicly. It is under the Open Ideas banner that Pearson has published Intelligence Unleashed (authored by Rose Luckin and Wayne Holmes of the London Knowledge Lab at University College London).

Pearson’s report proposes that artificial intelligence can transform teaching and learning. Its authors state that:

Although some might find the concept of AIEd alienating, the algorithms and models that comprise AIEd form the basis of an essentially human endeavour. AIEd offers the possibility of learning that is more personalised, flexible, inclusive, and engaging. It can provide teachers and learners with the tools that allow us to respond not only to what is being learnt, but also to how it is being learnt, and how the student feels.

Rather than seeking to construct a monolithic AI system, Pearson is proposing that a ‘marketplace’ of thousands of AI components will eventually combine to ‘enable system-level data collation and analysis that help us learn much more about learning itself and how to improve it.’

Underpinning its vision of AIEd is a particular concern with ‘the most significant social challenge that AI has already brought – the steady replacement of jobs and occupations with clever algorithms and robots’:

It is our view that this phenomena provides a new innovation imperative in education, which can be expressed simply: as humans live and work alongside increasingly smart machines, our education systems will need to achieve at levels that none have managed to date.

In other words, in the Pearson view, a marketplace of AI applications will both be able to provide detailed real-time data analytics on education and learning, and also lead to far greater levels of achievement by both individuals and whole education systems. Its vision is of augmented educational systems, spaces and practices where humans and machines work symbiotically.

In technical terms, what Pearson calls AIEd relies on a particular form of AI. This is not the AI with sentience of sci-fi imaginings, but AI reimagined through the lens of big data and data analytics techniques–the ‘ordinary artefacts’ of machine learning systems. Notably, the report refers to advances in machine learning algorithms, computer modelling, statistics, artificial neural networks and neuroscience, since ‘AI involves computer software that has been programmed to interact with the world in ways normally requiring human intelligence. This means that AI depends both on knowledge about the world, and algorithms to intelligently process that knowledge.’

In order to do so, Pearson’s brand of AIEd importantly requires the development of sophisticated computational models. These include models of the learner, models of effective pedagogy, and models of the knowledge domain to be learned, as well as models that represent the social, emotional, and meta-cognitive aspects of learning:

Learner models are ways of representing the interactions that happen between the computer and the learner. The interactions represented in the model (such as the student’s current activities, previous achievements, emotional state, and whether or not they followed feedback) can then be used by the domain and pedagogy components of an AIEd programme to infer the success of the learner (and teacher). The domain and pedagogy models also use this information to determine the next most appropriate interaction (learning materials or learning activities). Importantly, the learner’s activities are continually fed back into the learner model, making the model richer and more complete, and the system ‘smarter’.
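Read as a software architecture, the passage above describes three interacting components and a feedback loop. The sketch below is a minimal reconstruction of that loop under assumed names, topics and rules, not Pearson’s implementation: the learner model accumulates interactions, and the pedagogy model consults it and the domain model to choose the next activity.

```python
class LearnerModel:
    """Stores every interaction; each new activity is fed back in, enriching the model."""
    def __init__(self):
        self.history = []

    def record(self, topic, correct):
        self.history.append((topic, correct))

    def success_rate(self, topic):
        attempts = [correct for t, correct in self.history if t == topic]
        return sum(attempts) / len(attempts) if attempts else 0.0

class DomainModel:
    """The knowledge domain to be learned, here simply an ordered list of topics."""
    topics = ["fractions", "ratio", "algebra"]

class PedagogyModel:
    """Uses the learner and domain models to determine the next most appropriate interaction."""
    def next_activity(self, learner, domain, mastery_threshold=0.8):
        for topic in domain.topics:
            if learner.success_rate(topic) < mastery_threshold:
                return topic
        return None   # everything mastered

learner, domain, pedagogy = LearnerModel(), DomainModel(), PedagogyModel()
for correct in (True, True, False, True):        # simulated attempts at 'fractions'
    learner.record("fractions", correct)
print(pedagogy.next_activity(learner, domain))   # still 'fractions': success rate 0.75 < 0.8
```

The ‘smartness’ of the system, in this reading, is simply the tightness of the loop between recorded interactions and the selection of what comes next.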

Based on the combination of these models with data analytics and machine learning processes, Pearson’s proposed vision of AIEd includes the development of Intelligent Tutoring Systems (ITS) which ‘use AI techniques to simulate one-to-one human tutoring, delivering learning activities best matched to a learner’s cognitive needs and providing targeted and timely feedback, all without an individual teacher having to be present.’ It also promises intelligent support for collaborative working—such as AI agents that can integrate into teamwork—and intelligent virtual reality environments that simulate authentic contexts for learning tasks. Its vision is of teachers supported by their own AIEd teaching assistants and AIEd-led professional development.

These techniques and applications are seen as contributors to a whole-scale reform of education systems:

Once we put the tools of AIEd in place as described above, we will have new and powerful ways to measure system level achievement. … AIEd will be able to provide analysis about teaching and learning at every level, whether that is a particular subject, class, college, district, or country. This will mean that evidence about country performance will be available from AIEd analysis, calling into question the need for international testing.

In other words, Pearson is proposing to bypass the cumbersome bureaucracy of mass standardized testing and assessment, and instead focus on real-time intelligent analytics conducted up-close within the pedagogic routines of the AI-enhanced classroom. This will rely on a detailed and intimate analytics of individual performance, which will be gained from detailed modelling of learners through their data.

Pearson’s vision of intelligent, personalized learning environments is therefore based on its new understandings of ‘how to blend human and machine intelligence effectively.’ Specific kinds of understandings of human intelligence and cognition are assumed here. As Pearson’s AIEd report acknowledges,

AIEd will continue to leverage new insights in disciplines such as psychology and educational neuroscience to better understand the learning process, and so build more accurate models that are better able to predict – and influence – a learner’s progress, motivation, and perseverance. … Increased collaboration between education neuroscience and AIEd developers will provide technologies that can offer better information, and support specific learning difficulties that might be standing in the way of a child’s progress.

These points highlight how the design of AIEd systems will embody neuroscientific insights into learning processes–insights that will then be translated into models that can be used to predict and intervene in individuals’ learning processes. This reflects the recent and growing interest in neuroscience in education, and the adoption of neuroscientific insights for ‘brain-targeted‘ teaching and learning. Such practices target the brain for educational intervention based on neuroscientific knowledge. IBM has taken inspiration from neuroscience even further in its cognitive computing systems for education.

IBM cognition
One of the world’s most successful computing companies, IBM has recently turned its attention to educational data analytics. According to its paper on ‘the future of learning’:

Analytics translates volumes of data into insights for policy makers, administrators and educators alike so they can identify which academic practices and programs work best and where investments should be directed. By turning masses of data into useful intelligence, educational institutions can create smarter schools for now and for the future.

An emerging development in IBM’s data analytic approach to education is ‘cognitive learning systems’ based on neuroscientific methodological innovations, technical developments in brain-inspired computing, and artificial neural network algorithms. Over the last decade, IBM has positioned itself as a dominant research centre in cognitive computing, with huge teams of engineers and computer scientists working on both basic and applied research in this area. Its own ‘Brain Lab’ has provided the neuroscientific insight for these developments, leading to R&D in a variety of areas. Its work has proceeded through neuroscience and neuroanatomy to supercomputing, to a new computer architecture, to a new programming language, to artificial neural network algorithms, and finally to cognitive system applications, all underpinned by its understanding of the human brain’s synaptic structures and functions.

IBM itself is not seeking to build an artificial brain but a computer inspired by the brain and certain neural structures and functions. It claims that cognitive computing aims to ‘emulate the human brain’s abilities for perception, action and cognition,’ and has dedicated extensive R&D to the production of ‘neurosynaptic brain chips’ and scalable ‘neuromorphic systems,’ as well as its cognitive supercomputing system Watson. Based on this program of work, IBM defines cognitive systems as ‘a category of technologies that uses natural language processing and machine learning to enable people and machines to interact more naturally to extend and magnify human expertise and cognition.’

To apply its cognitive computing applications in education, IBM has developed a specific Cognitive Computing for Education program. Its program director has presented its intelligent, interactive systems that combine neuroscientific insights into cognitive learning processes with neurotechnologies that can:

learn and interact with humans in more natural ways. At the same time, advances in neuroscience, driven in part by progress in using supercomputers to model aspects of the brain … promise to bring us closer to a deeper understanding of some cognitive processes such as learning. At the intersection of cognitive neuroscience and cognitive computing lies an extraordinary opportunity … to refine cognitive theories of learning as well as derive new principles that should guide how learning content should be structured when using cognitive computing based technologies.

The prototype innovations developed by the program include automated ‘cognitive learning content’, ‘cognitive tutors’ and ‘cognitive assistants for learning’ that can understand the learner’s needs and ‘provide constant, patient, endless support and tuition personalized for the user.’ IBM has also developed an application called Codename: Watson Teacher Advisor, which is designed to observe, interpret and evaluate information to make informed decisions that should provide guidance and mentorship to help teachers improve their teaching.

IBM’s latest report on cognitive systems in education proposes that ‘deeply immersive interactive experiences with intelligent tutoring systems can transform how we learn,’ ultimately leading to the ‘utopia of personalized learning’:

Until recently, computing was programmable – based around human defined inputs, instructions (code) and outputs. Cognitive systems are in a wholly different paradigm of systems that understand, reason and learn. In short, systems that think. What could this mean for the educators? We see cognitive systems as being able to extend the capabilities of educators by providing deep domain insights and expert assistance through the provision of information in a timely, natural and usable way. These systems will play the role of an assistant, which is complementary to and not a substitute for the art and craft of teaching. At the heart of cognitive systems are advanced analytic capabilities. In particular, cognitive systems aim to answer the questions: ‘What will happen?’ and ‘What should I do?’

Rather than being hard-programmed, cognitive computing systems are designed like the brain to learn from experience and adapt to environmental stimuli. Thus, instead of seeking to displace the teacher, IBM sees cognitive systems as optimizing and enhancing the role of the teacher, as a kind of cognitive prosthetic or machinic extension of human qualities. This is part of a historical narrative about human-computer hybridity that IBM has wrapped around its cognitive computing R&D:

Across industries and professions we believe there will be an increasing marriage of man and machine that will be complementary in nature. This man-plus-machine process started with the first industrial revolution, and today we’re merely at a different point on that continuum. At IBM, we subscribe to the view that man plus machine is greater than either on their own.

As such, for IBM,

We believe technology will help educators to improve student outcomes, but must be applied in context and under the auspices of a ‘caring human’. The teacher-to-system relationship does not, in our view, lead to a dystopian future in which the teacher plays second fiddle to an algorithm.

The promise of cognitive computing for IBM is not just of more ‘natural systems’ with ‘human qualities,’ but of a fundamental reimagining of the ‘next generation of human cognition, in which we think and reason in new and powerful ways,’ as claimed in its white paper ‘Computing, cognition and the future of knowing’:

It’s true that cognitive systems are machines that are inspired by the human brain. But it’s also true that these machines will inspire the human brain, increase our capacity for reason and rewire the ways in which we learn.

A recursive relationship between machine cognition and human cognition is assumed in this statement. It sees cognitive systems as both brain-inspired and brain-inspiring, both modelled on the brain and remoulding the brain through interacting with users. The ‘caring human’ teacher mentioned in its report above is one whose capacities are not displaced by algorithms, but are algorithmically augmented and extended. Similarly, the student enrolled into a cognitive learning system is also part of a hybrid system. Perhaps the clearest illustration from IBM of how cognitive systems will penetrate into education systems is its vision of a ‘cognitive classroom.’ This is a ‘classroom that will learn you’ through constant and symbiotic interaction between cognizing human subjects and nonhuman cognitive systems designed according to a model of the human brain.

Biosocial spaces
Some of the claims in these reports from Pearson and IBM may sound far-fetched and hyperbolic. It’s worth noting, however, that most of the technical developments underpinning them are already part of cutting-edge R&D in both the computing and neuroscience sectors. Two recent ‘foresight’ reports produced by the Human Brain Project document many of these developments and their implications. One, Future Neuroscience, details attempts to map the human brain, and ultimately understand it, through ‘big science’ techniques of data analysis and brain simulation. The other, Future Computing and Robotics, focuses on the implications of ‘machine intelligence,’ ‘human-machine integration,’ and other neurocomputational technologies that use the brain as inspiration; it states:

The power of these innovations has been increased by the development of data mining and machine learning techniques, that give computers the capacity to learn from their ‘experience’ without being specifically programmed, constructing algorithms, making predictions, and then improving those predictions by learning from their results, either in supervised or unsupervised regimes. In these and other ways, developments in ICT and robotics are reshaping human interactions, in economic activities, in consumption and in our most intimate relations.
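
To make the distinction drawn here concrete, the sketch below is a minimal, hypothetical illustration of a supervised and an unsupervised ‘regime’ of machine learning, using scikit-learn on synthetic data; it is not code from the Human Brain Project or from any system discussed in this post.

```python
# Illustrative sketch of supervised vs unsupervised learning on synthetic data.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Synthetic 'experience': 200 observations with 2 features each.
X, y = make_blobs(n_samples=200, centers=2, n_features=2, random_state=0)

# Supervised regime: the algorithm learns from labelled examples (X, y)
# and improves its predictions against those labels.
clf = LogisticRegression().fit(X, y)
print("supervised accuracy:", clf.score(X, y))

# Unsupervised regime: no labels are provided; the algorithm infers
# structure (here, two clusters) from the data alone.
clusters = KMeans(n_clusters=2, random_state=0, n_init=10).fit_predict(X)
print("cluster sizes:", np.bincount(clusters))
```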

These reports are the product of interdisciplinary research between sociologists and neuroscientists, and are part of a growing social scientific interest in ‘biosocial’ dynamics between biology and social environments.

Biosocial studies emphasize how social environments are now understood to ‘get under the skin’ and to actually influence the biological functions of the body. In a recent introduction to a special issue on ‘biosocial matters,’ it was claimed that a key insight coming out of social scientific attention to biology is ‘the increasing understanding that the brain is a multiply connected device profoundly shaped by social influences,’ and that ‘the body bears the inscriptions of its socially and materially situated milieu.’ Concepts such as ‘neuroplasticity’ and ‘epigenetics’ are key here. Simply put, neuroplasticity recognizes that the brain is constantly adapting to external stimuli and social environments, while epigenetics acknowledges that social experience modulates the body at the genetic level. According to such work, the body and the brain are influenced by the structures and environments that constitute society, but are also the source for the creation of new kinds of structures and environments which will in turn (and recursively) shape life in the future.

As environments become increasingly inhabited by machine intelligence–albeit the machine intelligence of ordinary artefacts rather than superintelligences–computer technologies need to be considered as part of the biosocial mix. Indeed, IBM’s R&D in cognitive computing fundamentally depends on its own neuroscientific findings about neuroplasticity, and on the translation of the biological neural network models developed in computational neuroscience into the artificial neural networks used in cognitive computing and AI research.
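
By way of illustration only, the following toy artificial neural network (a generic textbook example, not IBM's cognitive computing architecture) shows how connection weights are repeatedly adjusted through exposure to data, a loose computational analogue of the synaptic ‘plasticity’ these accounts invoke.

```python
# A toy two-layer neural network trained on XOR; weights adapt to 'experience'.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

# Connection weights and biases, loosely analogous to synaptic strengths.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(10000):
    h = sigmoid(X @ W1 + b1)        # hidden-layer activations
    out = sigmoid(h @ W2 + b2)      # network output
    err = out - y                   # mismatch with 'experience'
    # Backpropagation: weights change as a function of activity and error,
    # so the network is gradually remoulded by the data it is exposed to.
    d_out = err * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0, keepdims=True)

print(np.round(out.ravel(), 2))  # approximates XOR: ~[0, 1, 1, 0]
```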

Media theorist N Katherine Hayles has mobilized a form of biosocial inquiry in her recent work on ‘nonconscious cognitive systems’ which increasingly permeate information and communication networks and devices. For her, cognition in some instances may be located in technical systems rather than in the mental world of an individual participant, ‘an important change from a model of cognition centered in the self.’ Her non-anthropocentric view of ‘cognition everywhere’ suggests that cognitive computing devices can employ learning processes modelled on those of embodied biological organisms, using their experiences to learn, achieve skills and interact with people. Therefore, when nonconscious cognitive devices penetrate into human systems, they can potentially modify the dynamics of human behaviours by changing brain morphology and functioning. The potential of nonhuman neurocomputational techniques based on the brain, then, is that they become legible as traces in the neurological circuitry of the human brain itself, impressing themselves on the cerebral lives of both individuals and wider populations.

Biosocial explanations are beginning to be applied to education and learning. Jessica Pykett and Tom Disney have shown, for example, that:

an emphasis on the biosocial determinants of children’s learning, educational outcomes and life chances resonates with broader calls to develop hybrid accounts of social life which give adequate attention to the biological, the nonhuman, the technological, the material, … the neural and the epigenetic aspects of ‘life itself.’

In addition, Deborah Youdell’s new work on biosocial education proposes that such conceptualizations might change our existing understandings of processes such as learning:

Learning is an interaction between a person and a thing; it is embedded in ways of being and understanding that are shared across communities; it is influenced by the social and cultural and economic conditions of lives; it involves changes to how genes are expressed in brain cells because it changes the histones that store DNA; it means that certain parts of the brain are provoked into electrochemical activity; and it relies on a person being recognised by others, and recognising themselves, as someone who learns. … These might be interacting with each other – shared meanings, gene expression, electrochemical signals, the everyday of the classroom, and a sense of self are actually all part of one phenomenon that is learning.

We can begin to understand what Pearson and IBM are proposing in the light of these emerging biosocial explanations and their application to new forms of neurocomputation. To some extent, Pearson and IBM are mobilizing biosocial explanations in the development of their own techniques and applications. Models of neural plasticity and epigenetics emerging from neuroscience have inspired the development of cognitive computing systems, which are then used to activate environments such as Pearson’s AIEd intelligent learning environments or IBM’s cognitive classroom. These are reconfigured as neurocomputationally ‘brainy spaces’ in which learners are targeted for cognitive enhancement and neuro-optimization through interacting with other nonconscious cognitive agents and intelligent environments.

In brief, the biosocial process assumed by Pearson and IBM proceeds something like this:

> Neurotechnologies of brain imaging and simulation lead to new models and understandings of brain functioning and learning processes
> Models of brain functions are encoded in neural network algorithms and other cognitive and neurocomputational techniques
> Neurocomputational techniques are built-in to AIEd and cognitive systems applications for education
> AIEd and cognitive systems are embedded into the social environment of education institutions as ‘brain-targeted’ learning applications
> Educational environments are transformed into neuro-inspired, computer-augmented ‘brainy spaces’
> The brainy space of the educational environment interacts with human actors, getting ‘under the skin’ by becoming encoded in the embodied human learning brain
> Human brain functions are augmented, extended and optimized by machine intelligences

In this way, brain-based machine intelligences are proposed to meet the human brain, and, based on principles of neuroplasticity and epigenetics, to influence brain morphology and cognitive functioning. The artificially intelligent, cognitive educational environment is, in other words, translated into a hybrid, algorithmically-activated biosocial space in the visions of Pearson and IBM. Elsewhere, I’ve articulated the idea of brain/code/space–based on geographical work on technologically-mediated environments–to describe environments that possess brain-like functions of learning and cognition performed by algorithmic processes. Pearson and IBM are proposing to turn educational environments into brain/code/spaces that are both brain-based and brain-targeted.

While we need to be cautious of the extent to which these developments might (or might not) actually occur (or be desirable), it is important to analyse them as part of a growing interest in how technologically-enhanced social environments based on the brain might interweave with the neurobiological mechanisms that underlie processes of learning and development. In other words, Pearson’s interest in AIEd and IBM’s application of cognitive systems to education need to be interpreted as biosocial matters of significant contemporary concern.

Of course, as Neil Selwyn cautions, technological changes in education cannot be assumed to be inevitable or wholly beneficial. There are commercial and economic drivers behind them that do not necessarily translate smoothly into education, and most ‘technical fixes’ fail to have the impact intended by their designers and sponsors. A fuller analysis of Pearson’s aims for AIEd or IBM’s ambitions for cognitive systems in education would therefore need to acknowledge the business plans that animate them, and critically consider the visions of the future of education they are seeking to catalyse.

More pressingly, it would need to develop detailed insights into the ways that the brain is being mapped, known, understood, modelled and simulated in institutional contexts such as IBM, or how neuroscientific insights and models are being embodied in the kinds of AI applications that Pearson is promoting. How IBM and Pearson conceive the brain is deeply consequential to the AI and cognitive systems they are developing, and to how those systems then might interact with human actors and possibly influence the cognition of those people by shaping the neural architectures of their brains. Are these models adequate approximations of human mental and cognitive functioning? Or do they treat the brain and cognition in reductive terms as a kind of computational system that can be debugged, rewired and algorithmically optimized, in ways which reproduce the long-standing tendency by technologists and scientists to represent mental life as an information-processing computer? As one recent account of the rise of this metaphor puts it:

Just a few years after the dawn of computer technology in the 1940s, the brain was said to operate like a computer, with the role of physical hardware played by the brain itself and our thoughts serving as software. … Propelled by subsequent advances in both computer technology and brain research, an ambitious multidisciplinary effort to understand human intelligence gradually developed, firmly rooted in the idea that humans are, like computers, information processors. This effort now involves thousands of researchers, consumes billions of dollars in funding, and has generated a vast literature consisting of both technical and mainstream articles and books … speculating about the ‘algorithms’ of the brain, how the brain ‘processes data’, and even how it superficially resembles integrated circuits in its structure. The information processing metaphor of human intelligence now dominates human thinking, both on the street and in the sciences.

To what extent, for example, are biological neural networks conflated with (or reduced to) artificial neural networks as findings and insights from computational neuroscience are translated into applied AI and cognitive systems R&D programs? A kind of biosocial enthusiasm about the plasticity of the brain and epigenetic modulation is animating the technological ambitions of Pearson and IBM, one that may be led more by computational understandings of the brain as an adaptive information-processing device than by understandings of it as a culturally and socially situated organ. Future research in this direction would need to interrogate the specific forms of neuro knowledge production they draw upon, as well as engage with social scientific insights into how environments really work to shape human embodied experience (and vice versa).

The translation of educational environments into biosocial spaces that are technologically enhanced by new forms of AI, cognitive systems and other neurocomputational applications could have significant effects on teachers and learners right down to biological and neurological levels of life itself. As Luciano Floridi has noted, these are not forms of ‘ultraintelligence’ but ‘ordinary artefacts’ that can outperform us, and that are designed for specific purposes–but could always be made otherwise, for better purposes:

We should make AI human-friendly. It should be used to treat people always as ends, never as mere means…. We should make AI’s stupidity work for human intelligence. … And finally, we should make AI make us more human. The serious risk is that we might misuse our smart technologies, to the detriment of most of humanity.

The glossy imaginaries of AIEd and cognitive systems in education projected by Pearson and IBM reveal a complex intersection of technological and scientific developments–combined with business ambitions and future visions–that require detailed examination as biosocial matters of concern for the future of education.


Brain/code/space

Ben Williamson

In a new article published in Information, Communication & Society I aim to make some sense of how machine learning algorithms and new forms of ‘brain-inspired’ computing are being imagined for use in education. In particular, the article examines IBM’s ‘Smarter Education’ programme, part of its wider ‘Smarter Cities’ agenda, focusing on its learning analytics applications (based on machine learning algorithms) and cognitive computing developments for education (which take inspiration from neuroscience for the design of brain-like neural network algorithms and neurocomputational devices). Together, these developments constitute the emergence of ‘learning algorithms’ that are responsive, adaptive and appear to possess some degree of sentience and cognition.

The article is part of a forthcoming special issue of the journal on ‘the social power of algorithms’ edited by David Beer, and it’s really great to be in the company of other papers by Daniel Neyland & Norma Möllers, Taina Bucher, Bernhard Rieder, and Rob Kitchin. It was Rob Kitchin’s work (which he presented at the first Code Acts in Education seminar in 2014) that originally got me interested in ideas about smart ‘programmable cities’–which I’ve taken up to explore ideas about education in smart cities–and in my article I’ve drawn on the concept of ‘code/space’ he developed with Martin Dodge. My starting place is that urban environments have recently been reimagined as ‘smart cities of the future’ with the computational capacity to monitor, learn about, and adapt to the people that inhabit them. In other words, smart cities are themselves ‘learning environments.’ What does it mean for urban space to learn? For IBM, the answer lies in neuroscience, and particularly in a synthesis of brain science and computer science innovations–both areas in which it has been significantly active, particularly in relation to the field of cognitive computing. IBM’s imaginary of the future smart city is one in which the environment itself is envisaged as being a ‘cognitive environment’–with schools as one such kind of space, as illustrated by its ideas for a ‘classroom that will learn you.’

In the article I explore the relationship between learning algorithms, neuroscience and the new learning spaces of the city by combining the notion of programmable code/space with ideas about the ‘learning brain’ to suggest that new kinds of ‘brain/code/spaces’ are being developed where the environment itself is imagined to possess brain-like functions of learning and cognition performed by algorithmic processes. I take IBM’s Smarter Education vision as an exemplar of its wider ambitions to make smart cities into highly-coded brainy spaces that are intended to supplement, augment and even optimize human cognition too.

In other words, IBM’s vision for Smarter Education is diagrammatic of its plans for ‘cognitive cities’ that are configured for advanced mental processing–and that rely on neuro-technological renderings of human brain functioning. The learning algorithms of learning analytics and cognitive computing applications imagined by IBM contain particular neuroscientific models of learning processes. Its glossy imaginary of Smarter Education acts as a seemingly desirable model not just for the future of schools in software-enabled urban environments, but as a diagram for future cities that are to be treated as learning environments and enacted by increasingly cognitive forms of computing technology.

The term brain/code/space registers how the learning algorithms of data analytics and cognitive computing are weaving constitutively into the functioning and experience of smart cities, including but not limited to the cognitive classrooms of IBM’s imagined smarter education environments. The brain/code/spaces of IBM’s smart cognitive classrooms are built around models of the brain that are encoded in the functioning of learning algorithms and inserted into the pedagogic space of the classroom. IBM’s imaginary of the brain/code/spaces of such cognitive learning environments is one instantiation of a new kind of urban space in which neuroscientific claims about brain plasticity are built in to the learning algorithms that constitute the functioning and experience of the environment itself. The notion of brain/code/space articulates a novel neurocomputational biopolitics in which brain functions are transcoded into data, and then codified into nonconscious cognitive learning algorithms and applications that are designed to augment human cognition. I suggest that IBM’s imaginary of Smarter Education is a kind of computational neurofuture-in-the-making, one that illustrates how the neuro-technological diagrammatization of the human ‘learning brain’ is being written in to the functioning of smart urban space through the design of learning algorithms.

The full paper, ‘Computing brains: learning algorithms and neurocomputation in the smart city,’ is available open access.


Intimate analytics

Ben Williamson

In April 2016 the Education Endowment Foundation launched the Families of Schools Database, a searchable database that allows any school in England to be compared with statistically-similar institutions. At about the same time, the Learning Analytics and Knowledge 2016 conference was taking place in Edinburgh, focused on the latest technical developments and philosophical and ethical implications of data mining learners and ‘algorithmic accountability.’ The current development of school comparison websites like the Families of Schools Database and the rapid growth of the learning analytics field point to the increasingly fine-grained, detailed and close-up nature of educational data collection, calculation and circulation–that is, they offer a kind of numerical and ‘intimate’ analytics of education.

Intimate datascapes
One way of approaching these school comparison databases and learning analytics platforms is through Kristin Asdal’s notion of ‘accounting intimacy.’ According to Asdal, practices of calculation are increasingly moving away from bureaucratic practices enacted in distant ‘centres of calculation’ to much more ‘intimate’ calculative practices that are enacted in situ, close to the action they measure. Intimacy also implies a close relationship, and practices of calculative or accounting intimacy can also be understood in terms of how numbers and numerical presentations of data can be used to build intimate relationships between different actors.

Radhika Gorur has adapted Asdal’s ideas to suggest that more ‘intimate accounting’ is increasingly occurring in education. Drawing on the example of the school comparison site MySchool in Australia, she argues that:

The public, especially parents, was exhorted to make itself familiar–intimate–with the school by studying the wealth of detail about each school that was on My School. The idea was that, armed with intimate knowledge of their child’s school, parents could exert pressure on schools to perform well and get the best outcomes for their children. Not only did My School become a technology through which the government entered intimate spaces of schools, schools themselves entered intimate spaces of living rooms and kitchens through discussions between parents.

Through these techniques, schools could become available for intimate scrutiny by the government as well as by parents.

The Families of Schools Database, like Australia’s MySchool, involves schools in providing highly intimate details—in the form of numbers—that can then be presented to the general public. These public databases allow the school to be known and discussed, as Gorur argues, in the intimate spaces of the home—as well as involving school leaders in the intimate accounting and disclosure of their institution’s performance according to various criteria. One aim of the Families of Schools Database is to enable statistically-similar schools to identify each other and then collaborate to overcome shared problems. An intimate knowledge of other institutions is required to facilitate such collaboration (though it might also motivate competition). While school data certainly are collected together and transported to distant centres of calculation to allow the compilation of such databases, a certain demand is placed on institutions to present themselves in terms of an intimate account, and ultimately to share that account as a means towards possible collaboration with their numerical neighbours.
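
As a purely hypothetical sketch of what ‘statistically-similar’ matching might involve, the snippet below groups invented schools into numerical ‘families’ using a simple nearest-neighbour comparison of a few school-level indicators; it is not the Education Endowment Foundation's actual methodology, and all names and figures are made up.

```python
# Hypothetical 'statistical family' matching on invented school indicators.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

schools = ["School A", "School B", "School C", "School D", "School E"]
# Columns: % pupils eligible for free school meals, prior attainment score,
# % with English as an additional language (all invented values).
indicators = np.array([
    [32.0, 27.5, 18.0],
    [30.5, 28.0, 20.0],
    [ 8.0, 31.0,  4.0],
    [35.0, 26.0, 22.0],
    [10.0, 30.5,  6.0],
])

scaled = StandardScaler().fit_transform(indicators)
# n_neighbors=3 includes each school itself, so two neighbours remain after filtering.
nn = NearestNeighbors(n_neighbors=3).fit(scaled)
_, idx = nn.kneighbors(scaled)

for i, school in enumerate(schools):
    family = [schools[j] for j in idx[i] if j != i]
    print(f"{school}: statistical 'family' -> {family}")
```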

Following Ingmar Lippert we might say that such practices of intimate accounting configure the school environment as a ‘datascape,’ one whose existence in organizational reality is achieved through the calculative practices that make it ‘accountable.’ By configuring the school environment as a ‘datascape,’ as Lippert argues, ‘reality is enacted’ as its intimate details are projected as a stabilized numerical account. Databases such as the Families of Schools Database might therefore be understood as intimate datascapes, where schools’ data are disclosed with the aim of building close relationships with parents and other institutions, whilst also becoming more visible to government.

Algorithmic intimacy
When it comes to learning analytics, the level of intimate accounting is increased even further. With such systems comes the technological ambition to know the microscopically intimate details of the individual learner. Major learning analytics platform providers such as Knewton claim to collect literally millions of data points about millions of users to amass vast datasets that can be used for the automatic analysis of learning progress and performance.

For Knewton, the value of big data in education specifically is that it consists of ‘data that reflects cognition’—that is, vast quantities of ‘meaningful data’ recorded during student activity ‘that can be harnessed continuously to power personalized learning for each individual.’ The collection and analysis of this ‘data that reflects cognition’ is a sophisticated technical and methodological accomplishment. As stated in documentation on the Knewton website:

The Knewton platform consolidates data science, statistics, psychometrics, content graphing, machine learning, tagging, and infrastructure in one place in order to enable personalization at massive scale. … Using advanced data science and machine learning, Knewton’s sophisticated technology identifies, on a real-time basis, each student’s strengths, weaknesses, and learning style. In this way, the Knewton platform is able to take the combined data of millions of other students to help each student learn every single concept he or she ever encounters.

The analytics methods behind Knewton include Item-Response Theory, Probabilistic Graphical Models, and Hierarchical Agglomerative Clustering, as well as ‘sophisticated algorithms to recommend the perfect activity for each student, constantly.’
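
For readers unfamiliar with the first of these methods, the sketch below shows the textbook two-parameter logistic form of an Item-Response Theory model, in which the probability of a correct answer depends on a learner's estimated ability and an item's difficulty and discrimination. It is a generic illustration with invented parameters, not Knewton's proprietary implementation.

```python
# Textbook two-parameter logistic (2PL) Item-Response Theory model.
import numpy as np

def p_correct(theta, a, b):
    """Probability that a learner with ability `theta` answers correctly an
    item with discrimination `a` and difficulty `b`."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

abilities = np.array([-1.0, 0.0, 1.0])   # three hypothetical learners
item_a, item_b = 1.2, 0.5                # one hypothetical item

for theta in abilities:
    print(f"ability {theta:+.1f}: P(correct) = {p_correct(theta, item_a, item_b):.2f}")
```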

What a learning analytics platform like Knewton appears to promise is a highly intimate and real-time analytics of the very cognition of the individual, mediated through particular technical methods for making the individual known and measurable. Again, as with the Families of Schools Database, it is clear that the data are being collected and transported to distant centres of calculation—namely Knewton’s vast servers—but the speed of this transportation has been accelerated massively as well as being automated. A vast new datascape of cognition–amassed methodologically according to the psychometric assumptions underlying Item-Response Theory et al–is emerging from such calculative practices.

Moreover, because Knewton’s platform is adaptive, it not only collects and analyses student data, but actively adapts to their performance so that each individual experiences a different ‘personalized’ pathway through learning content, as determined by machine learning algorithms. Such algorithms have the capacity to predict students’ probable future progress through predictive analytics processes, and then, in the form of prescriptive analytics, to personalize their access to knowledge through modularized connections that have been deemed appropriate by the algorithm. To give a sense of this, in Knewton’s documentation, it is stated that all content in the platform is:

linked by the Knewton knowledge graph, a cross-disciplinary graph of academic concepts. The knowledge graph takes into account these concepts, defined by sets of content and the relationships between those concepts. Knewton recommendations steer students on personalized and even cross-disciplinary paths on the knowledge graph towards ultimate learning objectives based on both what they know and how they learn.

The Knewton platform’s ‘knowledge graph’ treats knowledge in terms of discrete modules of content that can be linked together to produce differently connected personalized pathways.

In this sense, knowledge is treated in terms of a network of individual nodes with myriad possible lines of connection, and the Knewton platform ‘refines recommendations through network effects that harness the power of all the data collected for all students to optimize learning for each individual student.’ For Knewton, knowledge is nodal like a complex digital network, and constantly being refined as machine learning algorithms learn from observing large numbers of students engaging with it: ‘The more students who use the Knewton platform, the more refined the relationships between content and concepts and the more precise the recommendations delivered through the knowledge graph.’ In other words, Knewton is developing new kinds of intimacies between units of content and concepts, as well as identifying recommendations for students that are based on an assessment of the optimal relationship between the individual learner and the individual content item. The Knewton knowledge graph ultimately consists of networked data that reflects content, and data that reflects cognition, and it is constantly analyzing these data to find best fits, clusters, connections and relationships–or numerical intimacies in the datascape of content and cognition.
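
A toy example may help to picture this nodal treatment of knowledge. The sketch below represents concepts as nodes linked by prerequisite relationships and applies a naive recommendation rule; it is a hypothetical illustration, not Knewton's actual knowledge graph or recommendation engine.

```python
# A toy 'knowledge graph' of concepts linked by prerequisite relationships,
# with a naive rule: recommend concepts whose prerequisites are already mastered.
prerequisites = {
    "counting": [],
    "addition": ["counting"],
    "subtraction": ["counting"],
    "multiplication": ["addition"],
    "fractions": ["multiplication", "subtraction"],
}

def recommend(mastered):
    """Return concepts the learner is ready for but has not yet mastered."""
    return [
        concept
        for concept, prereqs in prerequisites.items()
        if concept not in mastered and all(p in mastered for p in prereqs)
    ]

learner_mastered = {"counting", "addition"}
print(recommend(learner_mastered))  # -> ['subtraction', 'multiplication']
```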

Real-time intimate action
What I am briefly trying to suggest here is that a kind of automated real-time intimate accounting at the level of the individual is occurring with these learning analytics platforms. Such platforms govern learners both at a distance—through transporting their data for collection and processing via data servers and storage facilities—and up close, intimately and immediately, through real-time adaptivity and personalized prescriptive analytics.

Whereas the Families of Schools Database and MySchool involve more intimate accounting among human actors mediated through public databases, the intimate action of learning analytics is algorithmic and subject to machine learning processes. The ambition of Knewton, and other learning analytics platform providers, is nothing less than an intimate account of the individual, which can then be analyzed as points in a vast networked datascape of content and cognition of others to ‘optimize learning’–and in this sense it instantiates a distinctive form of real-time intimate action that is targeted at individual improvement at the level of cognition itself.

Kristin Asdal suggests that intimate accounting involves the ways that calculative practices associated with ‘the office’ become implanted in ‘the factory’–that is, bureaucratic practices of distant data collection and calculation are displaced by practices of enumeration that are enacted much closer to the action being measured. Schools are now increasingly involved in their own practices of institutional intimate accounting and the production of the school environment as a datascape. The proliferation of learning analytics platforms brings intimate accounting into the everyday life and learning of the individual, with algorithms (and the methodologies that underpin them) designed to provide both an intimate account of the individual–as data that reflects cognition–and to undertake intimate action in the shape of prescriptive analytics and automatically personalized learning pathways that might shape the individual as a cognizing subject.


Educating Silicon Valley

Ben Williamson

Jurvetson silicon city
Image credit: Steve Jurvetson

A reality TV show in the UK has documented what school life is like in different geographical parts of the country. Starting in 2011 with Educating Essex, the format has since included Educating Yorkshire, Educating the East End, and Educating Cardiff. The idea of exploring educational experiences in different geographical and socio-demographic zones is an interesting one, and, for me, raises questions about educational experiences in other distinctive places and spaces. Some of my recent research has focused on ‘Educating the Smart City,’ for example.

In this piece I explore the idea of ‘Educating Silicon Valley.’ Silicon Valley’s high-tech companies, startups and culture of venture capital are, as Alistair Duff has argued in a new article, ‘the centre of a techno-economic revolution’ that is ‘now spreading outwards across the world, with major societal effects and implications.’ Surprisingly little research has been conducted on the Silicon Valley workers whose labour and learning contribute to this revolution. Here I try to piece together some sense of how education is being organized in Silicon Valley as an initial attempt to answer the question: how are the forms of knowledge, skills, practices and ways of thinking that contribute to a techno-economic revolution taught and learnt? And how does Silicon Valley seek to shape education to reproduce its centrality to the techno-economic revolution?

Investing in education
Silicon Valley has significant interests in education. On one level, its interests simply reflect market opportunities and business plans—education is a big market, and certain Silicon Valley educational technology products like ClassDojo have quickly spread worldwide. Its interests in education are also, though, more political than simply commercial.

In his recent study of the political outlook of Silicon Valley’s technology elite, Greg Ferenstein has identified key features of a ‘Silicon Valley ideology’:

The Silicon Valley ideology thinks about government as an investor rather than as a protector, arguing that the government’s role is to invest in making people as awesome as possible. Silicon Valley wants to make people in general educated and entrepreneurial.

Notably, the Silicon Valley ideology sees education as the solution to major social, political and technological problems. As Ferenstein notes in his e-book The Age of Optimists, many Silicon Valley startup founders ‘believe that the solution to nearly every problem is more innovation, conversation or education,’ and therefore ‘believe in massive investments in education because they see it as a panacea for nearly all problems in society.’ They particularly like performance-based funding systems like charter schools as educational alternatives that can operate free of centralized government regulation and teachers’ unions.

A particular politics therefore underpins Silicon Valley’s approach to education, one which emphasizes the centrality of education to innovation and to the creation of ‘awesome,’ entrepreneurial individuals. The ways in which it seeks to achieve such aims include, surprisingly, homeschooling.

High-tech homeschooling
A recent article in Wired has shown that many Silicon Valley coders, hackers and makers are now choosing to educate their own children at home. It profiles a new breed of homeschoolers—the techie parents who see public or state education as fundamentally broken, and have chosen instead to educate their children themselves. The Silicon Valley homeschooler is not the fundamentalist activist of liberal stereotyping. Instead, the high-tech homeschooler sees makerspaces and hackerspaces as ideal kinds of educational institutions, where children can learn directly through tinkering, hacking, coding and making, rather than through the prescriptive, standardized model of state schooling.

These new Silicon Valley homeschoolers blend the approach of hackerspaces with a much longer lineage of progressivist education that includes such important ‘deschooling’ figures as Ivan Illich and ‘unschoolers’ such as John Holt. The deschooling and unschooling movements fundamentally saw schools as overly constrictive, and advocated instead for learners to engage in more self-directed education in real-life settings and social networks. This is an irresistible invitation for those who share the Silicon Valley ideology when it comes to rethinking education.

Through the convergence of Silicon Valley politics and progressivist thinking, the new hacker-homeschoolers represent a new breed of neo-unschoolers. As the Wired article explains:

They don’t prefer homeschooling simply because they find most schools too test-obsessed or underfunded or otherwise ineffective. They believe that the very philosophical underpinnings of modern education are flawed. Unschoolers believe that children are natural learners; with a little support, they will explore and experiment and learn about the world in a way that is appropriate to their abilities and interests. Problems arise, the thinking goes, when kids are pushed into an educational model that treats everyone the same—gives them the same lessons and homework, sets the same expectations, and covers the same subjects.

One way of educating Silicon Valley, then, is through high-tech homeschooling and hackerspaces. Of course, not all Silicon Valley workers are educated in this way; homeschooling is one part of an emerging consensus in the valley that state schooling is broken and that alternative practices and institutions are required.

Silicon startup schools
A notable educational development around Silicon Valley is the establishment of new ‘startup schools.’ The most prominent example is AltSchool, set up in 2013 by Max Ventilla, a former tech entrepreneur and Google exec, which ‘prepares students for the future through personalized learning experiences within micro-school communities.’ Its stated aim is to ‘help reinvent education from the ground up.’

After establishing itself across four sites in San Francisco as a ‘collaborative community of micro-schools,’ AltSchool expanded in September 2015 to Brooklyn and Palo Alto, with further plans for new schools in 2016. It has since hired executives from Google, Uber and other successful Silicon Valley startups.

The AltSchool chief technology officer, formerly the engineer in charge of the Google.com homepage and search results experience, has stated that ‘I am highly motivated to use my decade of Google experience to enable the AltSchool platform to grow and scale.’ Elsewhere on the AltSchool site, the AltSchool ‘platform’ is described as a new ‘central operating system for education,’ a scalable technical infrastructure that can be transported to new sites. Its platform primarily consists of a powerful software aggregation and data analytics tool which:

pulls in assessments from individual student work, projects, and 3rd party standards, forming a comprehensive view of a student’s progress in each area. An educator can quickly see where a student has demonstrated mastery and where they need to improve specific skills.

In support of this system, its website refers to ‘technology-enabled models’ that are disrupting other industries and institutions, such as Uber and Airbnb, and applies these ideals to education. As a tech platform, AltSchool is managed on analytical, technical and scientific lines, albeit laced with the progressivist discourse from which it draws its central philosophy.
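
To give a sense of the kind of aggregation being described, the sketch below pulls invented assessment records from several sources into a per-student, per-skill view of apparent mastery using pandas; it is a hypothetical illustration, not AltSchool's platform code.

```python
# Hypothetical aggregation of assessment records into a mastery overview.
import pandas as pd

assessments = pd.DataFrame([
    {"student": "S1", "skill": "fractions", "source": "project",   "score": 0.9},
    {"student": "S1", "skill": "fractions", "source": "3rd_party", "score": 0.7},
    {"student": "S1", "skill": "writing",   "source": "classwork", "score": 0.4},
    {"student": "S2", "skill": "fractions", "source": "classwork", "score": 0.5},
])

# Average the evidence per student and skill, then flag apparent mastery.
progress = assessments.groupby(["student", "skill"])["score"].mean().reset_index()
progress["mastered"] = progress["score"] >= 0.8

print(progress)
```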

Other startup schools include The Primary School, currently being set up by Mark Zuckerberg and Priscilla Chan, and The Khan Lab School, established by Salman Khan of the Khan Academy. The Khan Lab School (which consciously echoes John Dewey’s experimental Lab School at the University of Chicago) specializes in math, literacy and computer programming—in line with its tech sector roots—but also emphasizes ‘real world’ projects, character development, personalized learning, student-centred learning, and a strong commitment to building children’s ‘character’ and ‘wellness’ through, for example, ‘mindfulness’ meditation training. Like AltSchool, though, says Jason Tanz, its ‘touchy feely’ surface of character-centred learning is combined with analytics tools for ‘tracking data about every dimension of a student’s scholastic and social progress.’

Silicon Valley is actively involved in funding and investing in these new models of schooling. The venture philanthropic Silicon Schools Fund, for example, ‘provides seed funding for new blended learning schools that use innovative education models and technology to personalize learning.’ Its vision is:

  • Schools that give each student a highly-personalized education, by combining the best of traditional education with the transformative power of technology
  • Students gaining more control over the path and pace of their learning, creating better schools and better outcomes
  • Software and online courses that provide engaging curriculum, combined with real-time student data, giving teachers the information they need to support each student
  • Teachers developing flexibility to do what they do best — inspire, facilitate conversations, and encourage critical thinking

In 2015, Laurene Powell Jobs (the widow of Steve Jobs) granted a $50 million philanthropic donation to a crowdsourced school redesign project. The XQ Super School Project is a competition to redesign the American high school, which it sees as a ‘dangerously broken’ social institution. Like the Silicon Schools Fund, the project is emblematic of Silicon Valley efforts to invest in education through venture philanthropic means and the role of wealthy tech-entrepreneurial individuals in the attempt to ‘fix’ schools. These programs provide a template for school reform that includes ‘transformative’ technology solutions, real-time data monitoring and measurement, and personalized learning supported by online courses.

Startup schools might be seen as alternative shadow schools that challenge the supposed bureaucratic standardization of state education. These schools have mobilized the opportunity presented by US charter school policies to create new institutions that lie outside of state regulation and control, and are committed to the rigorous scientific monitoring of their performance through techniques of data collection and analysis. Through establishing such schools, Silicon Valley is seeking to create institutions that might be appropriate to the production of the entrepreneurial individuals who will inhabit the next wave of the techno-economic revolution.

Stanford psychology
Many Silicon Valley employees studied at Stanford University, one of the world’s leading research and teaching universities and itself situated in the heart of the valley. Beyond geographical proximity, there has long been a revolving door between Stanford University and Silicon Valley. As Rithika Trikha explains:

It’s not only witnessed, but also notoriously housed, some of the most celebrated innovations in Silicon Valley. … In return, its entrepreneurial alumni offer among the most generous endowments to the university, breaking the record as the first university to add more than $1 billion in a single year. Stanford shares a relationship with Silicon Valley unlike any other university on the planet, chartering a self-perpetuating cycle of innovation.

These tremendous endowments certainly confirm that Silicon Valley founders are committed to massive investment in education and innovation as a way of addressing social problems.

One of the most significant Stanford research units in the education of Silicon Valley workers is the Persuasive Technology Lab. The lab aims to apply persuasive technologies to ‘bring about positive changes in many domains, including health, business, safety, and education,’ and ‘creates insight into how computing products—from websites to mobile phone software—can be designed to change what people believe and what they do.’

According to Jacob Weisberg, some of Silicon Valley’s most successful startup founders and app designers are alumni of the lab. They subscribe to its insights about designing technologies to create ‘habit-forming products’—the title of a book by one of the lab’s key researchers is Hooked: How to Build Habit-Forming Products—otherwise known as ‘persistent routines’ or ‘behavioral loops.’

Silicon Valley companies such as Facebook and Instagram, says Weisberg, have mastered the creation of habit forming products by basing their design on insights into human behaviour from behavioural economics and consumer psychology:

Designers can hook users through the application of psychological phenomena such as investment bias—once you’ve put time into personalizing a tool, you’re more likely to use it. … Another tool is rationalization, the feeling that if one is spending a lot of time doing something, it must be valuable.

Through study at the Persuasive Technology Lab, young Silicon Valley designers are educated into the behavioural and psychological tricks of nudging, influencing and persuading people to change their behaviours, in ways which hook users to their products.

Teen technorati
While higher education institutions such as Stanford clearly have a powerful role in educating Silicon Valley, other emerging organizations from within the valley itself are beginning to challenge this status quo—indeed, many Stanford students don’t even finish their degrees, preferring to establish their own startups instead.

The Thiel Fellowship program, established by PayPal founder Peter Thiel, for example, proposes that educational institutions are entirely redundant when it comes to the meaningful education of young technology entrepreneurs. Each year, selected fellows of the program receive:

a grant of $100,000 to focus on their work, their research, and their self-education while outside of university. Fellows are mentored by our community of visionary thinkers, investors, scientists, and entrepreneurs, who provide guidance and business connections that can’t be replicated in any classroom.

Recipients of the fellowships are all aged 22 or under, and all possess highly impressive track records in entrepreneurship and technical innovation. A key demand of the program is that its fellows ‘skip or stop out’ of higher education, or even school, and engage in self-directed technical research. Five years after being established in 2011, the program claims that Thiel Fellows have started over 60 companies that are together worth $1.1 billion. Recipients of the fellowship have been profiled in an online video series called Teen Technorati hosted by Wired.com, which looks like a hybrid of The Apprentice and the satirical Silicon Valley series.

The Thiel Fellowship also supports its fellows in approaching technology incubator and accelerator programs like Y Combinator. Incubators help new startups to test and validate ideas, while accelerators turn products into scalable businesses, often through direct equity investment, and provide legal, technical and financial services along with mentorship, working space and access to educators, entrepreneurs, business partners and potential investors. These incubator and accelerator programs are themselves educational in the sense that they provide on-the-job mentoring as a kind of apprenticeship into the cultural, technical and economic practices of Silicon Valley.

Self-help Valley
Once any successful teen technorati or Stanford graduates have made it as far as a job in the valley, the learning does not stop. For a start, many of the technical roles in Silicon Valley companies and startups require a formidable amount of learning as new programming languages, software packages and so on have to be mastered. The kind of self-education promoted by the Thiel Fellowship is a way of enculturating young people into these pressures.

With its relentless demands for innovation, Silicon Valley is also a place where individuals are under pressure to innovate on themselves—to make themselves as awesome as possible, to paraphrase Greg Ferenstein once more. As a consequence, the self-help industry in Silicon Valley is booming.

Jennifer Kahn has documented the range of emerging self-help courses that have spread around the valley campuses. Many of these training curricula, Kahn argues, are based on insights from the field of behavioural economics, and emphasize how ‘bad mental habits,’ ‘cognitive errors’ and ‘hidden failures’ (such as procrastination, making poor investments, wasting time, fumbling important decisions, and avoiding problems) can be overcome through rationalist self-analysis. Such programs, says Kahn, have generated ‘interest among data-driven tech people and entrepreneurs who see personal development as just another optimization problem.’ Silicon Valley’s self-help programs promise to enable users to be ‘more intellectually dynamic and nimble’ and to ‘fix personal problems.’

Popular Silicon Valley self-help initiatives translate psychological and behavioural economics insights into training curricula that are aimed at personal optimization. These training curricula encourage valley workers to see themselves in rationalist terms as a programming problem—as a pattern of behaviours and rules in a complex system that, if analyzed hard enough, can be tweaked and modified to perform optimally. As Kahn describes it, they view ‘the brain as a kind of second-rate computer, jammed full of old legacy software but possible to reprogram if you can master the code.’

Self-programmable Silicon Valley
Education in Silicon Valley is driven by several key ways of thinking:

  • Distrust of state education, and a belief that state schooling is broken, bureaucratic and philosophically flawed
  • Confidence in the power of reformed education to drive innovation and thus lead to the solution to major social problems
  • Emphasis on real-world problems, hands-on technical experience and practical learning
  • Commitment to measurement and metrics in the assessment and evaluation of the performance of institutions
  • Belief that philanthropy and venture capital investment (and hybrid combinations of philanthrocapital) can provide the means to fix educational institutions
  • Subscription to the idea that humans are sub-optimal computing machines that can be analyzed for their psychological bugs and fixed through training and rational self-analysis

Several intellectual lines of thought can be detected here: the lingering progressivist commitment to experiential learning; the emphasis of behavioural economics on humans’ ‘mental errors’; and the technocratic assumption that problems can be fixed better with technology than by government intervention.

The task of educating Silicon Valley is one that involves varied institutions, pedagogic practices and curricula. These diverse practices and resources provide a loose educative network of opportunities, or an infrastructure of learning, that is intended to shape the knowledge, skills, cultural practices and ways of thinking of Silicon Valley people. Indeed, Silicon Valley’s current enthusiasm for investing in and reforming education could be understood as a way in which it is seeking to reproduce its own culture and values. It’s creating new institutions and practices to educate and produce awesome and entrepreneurial innovators–like the self-programmable workers influentially described by Manuel Castells:

Self-programmable labour is equipped with the ability to retrain itself, and adapt to new tasks, new processes and new sources of information, as technology, demand, and management speed up their rate of change.

Ultimately, the task of educating Silicon Valley correlates with the reproduction of self-programmable labour that can retrain itself.

Critical educational sociology has long dealt with how the knowledge and culture of powerful social groups are transmitted through educational institutions, and how this process works to reproduce their social and cultural power. Through its infrastructure of high-tech homeschooling, startup schools, higher education partnerships, teen technorati fellowships and rationalist self-help programs, Silicon Valley is educating itself in order to reproduce its powerful centrality in the current techno-economic revolution.
