Ben Williamson
The notion that young people should learn to code has exploded from a minority concern into a mainstream agenda in just the last five years. In March 2015, the BBC announced plans to give away a million coding devices, one to every pupil entering year 7 of secondary schooling. Just a week earlier, Nesta produced an extensive report on the state of ‘digital making’ among young people in the UK. Digital coding and making is clearly an important educational event, integrally connected to the September 2014 introduction of new Computing programmes of study in the National Curriculum. But what kind of ‘culture of coding’ does it promote?
Image credit: Josh Graciano
Making it
Written by Oliver Quinlan, Nesta’s report, entitled Young Digital Makers, claims that learning to code is a key part of growing a nation of ‘digital makers’: individuals with the skills and knowledge to create the digital products of the future, grow the economy, and contribute to societal improvement. It refers to ‘digital making’ as:
distinct from simply using digital devices, and as the best way of understanding how technology works. Our work to date has focused on helping young people to ‘look under the hood’ of technology while they are making. From programming entirely on a computer to designing and 3D printing physical objects, digital making represents a diverse range of activities.
Many of these activities have been promoted in the last two years through Nesta’s own Make Things Do Stuff campaign. But Young Digital Makers signals a mainstreaming of these activities:
A huge expansion is needed if we are to grow a nation of digital creators who can manipulate and build the technology that both society and industry are increasingly reliant on. This expansion cannot be left exclusively to professionals, however, as we simply don’t have enough of them. It will require the mobilisation of enthusiasts and interested amateurs, from parents and non-expert teachers, to those working in the tech industry, working and learning alongside young people to help meet this demand.
The report provides a really useful summary of the state of digital making activities, including learning to code, and campaigns strongly for expansion in this area in order to achieve future social, economic and industry aspirations. It also argues that ‘Schools must exploit their potential as a hub for digital making opportunities, work with informal learning organisations, raise parents’ awareness and recruit volunteers.’
Young Digital Makers features a foreword by Tony Hall, Director-General of the BBC, who uses his prefatory remarks to announce the BBC’s major 2015 campaign ‘Make It Digital.’ Upon its launch on 12 March 2015, the BBC described ‘Make It Digital’ as ‘a major UK-wide initiative to inspire a new generation to get creative with coding, programming and digital technology.’ In addition to a range of broadcast programming (including programmes on CBeebies, and tie-ins with EastEnders and Doctor Who), the BBC is planning to give away a million ‘Micro Bit’ coding devices, one to every pupil entering year 7 in the UK:
The Micro Bit will be a small, wearable device with an LED display that children can programme in a number of ways. It will be a standalone, entry-level coding device that allows children to pick it up, plug it into a computer and start creating with it immediately. It is designed to be a starting point to get younger children interested in coding so they can move on to other, more complex devices in future.
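To give a concrete sense of what ‘plug it in and start creating’ can mean, the sketch below shows the sort of entry-level program such a device invites, written in MicroPython, the beginner-oriented language the micro:bit hardware eventually shipped with. It is purely illustrative rather than drawn from the BBC’s announcement: it scrolls a greeting across the LED display, then shows a heart whenever button A is held down.

    # Illustrative MicroPython sketch for a micro:bit-style device (not from the BBC announcement).
    from microbit import *

    display.scroll("Hello")            # scroll a greeting across the 5x5 LED display

    while True:
        if button_a.is_pressed():      # is button A currently held down?
            display.show(Image.HEART)  # light up a heart
        else:
            display.clear()            # otherwise blank the display
        sleep(100)                     # wait 100 milliseconds before checking again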
The device is clearly designed to resonate with the legacy of the BBC Micro, a matter examined in a previous Nesta report, which states that ‘the BBC Micro was complemented by activities that increased demand for computing generally, by promoting cultural shifts in attitudes towards computing and delivering learning into homes and schools.’
Coding cultures
Both the BBC and Nesta, and particularly the two in collaboration, support the notion that coding and making are part of a cultural shift, one originating at least in part with the BBC Micro in the 1980s and projected into the future by the Micro Bit’s launch in 2015. But the kind of culture of coding implied by Nesta and the BBC is one with a longer historical genealogy.
If we want to understand the current interest in learning to code and digital making, it is worth considering how it reflects a set of sharp historical divisions in the disciplines of computing that Brian Hayes has traced out in a recent analysis of the ‘Cultures of Code’: divisions between computer science, computational science, and software development. Computer science is concerned with understanding underlying algorithms; software development with producing tangible artefacts; and computational science treats the computer not as an object of study but as a scientific instrument. The divisions run deeper than this. Computer scientists, computational scientists and software developers work in different settings, attend different conferences, belong to different professional associations, and have different ways of working, worldviews, systems of thinking, and professional practices. They are, as Hayes argues, distinctly different ‘cultures of code.’
These differences are important for situating learning to code, the Micro Bit, digital making and its possibilities. Hayes is optimistic that ‘a new generation discovers that coding is cool’ and about the ‘hacker enthusiasm’ for the ‘nerdy side of life,’ but is cautious about the long-term contribution of learning to code initiatives to the field of computing:
How will the members of this exuberant new cohort distribute themselves over the three continents of computer science, computational science, and software development? What tasks will they put on their agendas? At the moment, most of the energy flows into the culture of software development or programming. The excitement is about applying computational methods, not inventing new ones or investigating their properties. … Everyone wants to pick up the knack of coding, but the more abstract and mathematical concepts at the core of computer science attract a smaller audience.
Learning to code therefore needs to be understood in terms of the longer history of the separation of coding from computer science, and potentially seen as an unhelpfully ‘cool’ deviation from the far less funky business of advancing the future of computing itself. The culture of code associated with computer science, and the professional identities of those who practise it, are distinct from the culture of code associated with learning to code and the potential professional identities available to those who pursue coding further.
This in turn leads to another point of history. As Nathan Ensmenger has shown in his history of computer programming, The Computer Boys Take Over, what is meant by ‘coding’ is itself a historical artefact. When computing first emerged as a professional practice in the 1940s and 50s, a ‘coder’ was seen merely as a ‘glorified clerical worker’ and the task of coding was almost exclusively performed by women, who were expected to:
code into machine language the higher-level mathematics developed by male scientists and engineers. Coding implied manual labor, and mechanical translation or rote transcription; coders were obviously low on the intellectual and professional hierarchy.
The actual art of ‘programming,’ as it came to be known in the late 1940s, consisted of six steps (including mathematical conceptualization, algorithm selection, and numerical analysis), only the last of which was the ‘coding’ done by female coders.
These historical accounts should alert us to the longer lineages from which learning to code, digital making, and coding devices for children have been formed. It’s important to be cognizant that the ‘culture of code’ associated with ‘learning to code’ may lie rather closer to the culture of software development than to that of computer science. The researcher of coding cultures Adrian Mackenzie, for example, talks of the ‘urbanization of code,’ whereby many contemporary coding practices bear the imprint, goals, values and assumptions of urban Silicon Valley cultures rather than of disciplinary Computer Science cultures. The Californian culture of coding is entrepreneurial, shaped by business plans, and centred on ‘startups’ and ‘technical solutionism’ rather than on disciplinary or theoretical advancement. Where code is produced, by which people, and according to which ways of thinking, are important questions for any study of coding cultures. Learning to code may be rather more complex than the glorified clerical work of the coding girls of the 1940s and 50s (though in fact coding then proved much less easy than its scientific planners imagined), but as a set of practices and ways of thinking it might also be much closer to commercial coding than to computer science.
None of this is to discount the potential educational significance of the Micro Bit, learning to code clubs, or other forms of digital making. But it is intended to put such innovations in their necessary historical context so as not to take them for granted, and to begin tracing the assumptions on which they’ve been built, and the longer lines of thinking that have made them seem like the correct solution to contemporary problems. As the culture of code associated with learning to code and digital making now enters school through the computing curriculum, supported by the massive digital and broadcast infrastructure of the BBC, we need to be aware of how those assumptions, values and practices are now being promoted in schools, and inserted into the everyday practices and ways of thinking of young people themselves. It will shape their ways of engaging with computers, and in turn, will influence the ways they go about shaping computers towards particular goals in the future.