We are all guilty of relying too heavily on personal experience – and not evidence – when it comes to views on how society should work. We tend to assume our own experience is the measure of how people should behave. My treatment in hospital may have been excellent…but this is not necessarily how things really are. We have only to read the most recent Care Quality Commission report to know differently.
It’s sometimes the same with policy-makers. Like the rest of us, they can be swayed by personal experience and what they see at first hand, and they may over-rely on anecdotal evidence. How often, to prove the success of their policies, have we heard reference to ‘witnesses’ – a family I’ve just spent the day with, a factory I toured – rather than to research evidence and the testimony of numbers as to what does and does not work?
Certainly, evidence can be counter-intuitive, and evidence-informed policy making does not always come naturally to those in positions of political leadership. Research needs to be commissioned, and its findings take time to arrive. Outcomes may not support a policy a minister is very keen to set in motion (no minister wants to be remembered for doing nothing!). Politicians also need to think about where to get their evidence from: special advisers and experts, lobbyists or think tanks, the media, members of their constituency, or academic research? And what about the influence of ‘gurus’, or studies undertaken by sole individuals that prove very popular and are taken up in a big way by the media?
Then there’s work to do to get policy makers and researchers to understand each other’s perspectives on research evidence. Policy makers want research to be rooted in common sense, contextual, policy-relevant, timely, clearly messaged, jargon-free, short, concise and accessible. Researchers see it as scientific (and so context-free), empirically proven, theoretically driven, needing as long as it takes (not just timely), carrying caveats and qualifications where necessary, using the language of the discipline, and detailed, comprehensive and methodologically rigorous.
We need a culture that values knowledge and information, that recognises these differences, and that builds the capacity of both policy makers and researchers – one that takes into account the many aspects of policy making and understands its pressures (political, economic and timescale-driven), and that also addresses the influence of the media and its responsibility for accuracy in communicating research to the wider public.
Statistics – through quantitative research – has an important role to play in challenging assumptions that policy x or y will make, or has made, a difference. Researchers often find ministers easiest to engage with research while a policy is being developed, and less keen to continue engaging thereafter, whereas the real effects of a policy usually only emerge years later – in education, as long as six or more years later – by which time the appetite for any real statistical analysis may have waned (especially if the research finds that the policy has not had the intended effect).
And then, many problems are complex and resistant to change, and we need both statistical and qualitative data to address them… policy makers, researchers and research mediators working together through the entire policy development, monitoring and evaluation process: designing research, generating research questions, verifying findings and so on. There are some great outfits already doing this, but we still need more. If you are interested in reading more about education policy making, turn to our report on Past Marks, a recent seminar in parliament.