Evidence is a problematic concept when thinking about digital technology in education because:

  • it is very difficult to show a clear causal relationship between a single variable (such as the introduction of phones) and learning outcomes – all the more so when everything else is changing at the same time.
  • digital technology has changed the nature of disciplines outside school, and thus should also impact on the curriculum inside school; it also provides new strategies for supporting learners. However, the existing metrics in education (including high-stakes summative testing and QA frameworks such as the Ofsted framework) have not changed enough to reflect this, and thus fail to capture the new impacts that digital technology has had on practice and learning outcomes. Examples include new strategies for collaboration (in social media, perhaps) and the new weight placed on remote collaboration within the workplace.
  • the effectiveness of digital technology in enhancing learning depends on how it is used, and very subtle differences in the way it is implemented can have large impacts, making it difficult to generalise validly across different implementations and contexts.

Arguably, a more productive approach to enhancing the use of digital technology in education would be to adopt an engineering mindset rather than a more traditional scientific one. Thus, rather than trying to disprove hypotheses, the focus should be on building upon the success of specific solutions in order to refine them.

It is often suggested that NASA would never have got an astronaut on the moon if they had adopted a scientific approach, because they would quickly have found evidence that it wasn't possible. That is not to say that one shouldn't build upon sound scientific evidence – as, for example, we do with new insights from cognitive science.

Of course, a great deal is known from research on the pioneering UK implementation of digital technology in schools, which started in the early 1980s. Indeed, in 2006, a review of the implementation of part of the Harnessing Technology strategy highlighted that the problems being encountered then were the same problems that research had identified over the previous 30 years, and which had still not been addressed (Twining et al., 2006).

At the heart of this is a debate about learning outcomes – which are themselves complex and multivariate. What we are saying, though, is simply this: try something new that has already worked elsewhere and where you have seen and trusted the outcomes. Tailor it to your context and feed back your own conclusions as they emerge.