Achieving public dialogue


6.3 Some issues for consideration

DEMOCS offers a novel, and perhaps unique, approach to public participation on contentious science issues. But how far is the process capable of dealing with the difficulties and uncertainties raised in the examples of engagement processes already considered in this course, and what benefits might it bring? For example:

  1. How far is this process of group discussion likely to lead to outcomes that are representative of ‘public’ opinion?

  2. Is the process designed to ensure ‘balance’ in the dialogue?

  3. Does the process address the ‘whole’ picture, by, for example, explicitly addressing social, ethical and economic as well as scientific aspects of the topic under consideration?

  4. What steps will be taken to ensure that the outcomes of the process are both valued and used to maximal effect, e.g. within policy making?

Bearing these points (and most likely more!) in mind, a number of more specific questions emerge, which are explored in the following activities. Though they arise from the DEMOCS activity, they could be asked of any dialogue initiative.

Activity 4

DEMOCS may be used in a range of different contexts, only some of which are linked with science policy making. Do you think that an approach such as this can have value even if not linked to policy making? If so, what kinds of benefits might it bring?

Answer

Even without a direct link to policy, it is possible that the games might help people to feel more informed and empowered within the democratic process (see concluding remarks in Reading 5), and therefore better able to evaluate competing claims made in the media and by politicians in relation to contentious areas of science. Indeed, feedback obtained during the early development phase of DEMOCS (Walker and Higginson, 2003) and a later independent assessment carried out by University College London (PUEC Group, University College London, 2004) suggest that the games do provide ‘a way into’ new and complex topics, ‘while at the same time discouraging over-simple or dogmatic conclusions’. The PUEC evaluation suggests that the group discussion process helps in developing relevant, transferable personal skills related to decision making, such as constructive discussion, negotiation and consensus building.

Set against this, there are also some concerns that the game process itself is complex, and that this might on occasion divert attention from the issues at hand and limit what can be achieved in the time available. Moreover, although nef collates and publishes the results of games that are reported back, DEMOCS will need to reach a much larger audience if the more ambitious goal of generating wider public interest and involvement in developing science policy is to be met.

Activity 5

Given that the conversations prompted by DEMOCS are both informed and driven by materials provided in the kit, should this be considered a ‘top down’ or a ‘bottom up’ approach (see Reading 5 again)? Here you might like to consider who should, and in practice actually does, set the agenda – those producing the kit, or the people participating in the process?

Answer

nef is particularly concerned that the games offer a truly ‘bottom-up’, citizen-led approach: it takes great care to achieve ‘balance’ in the materials provided, and is keen to respond to feedback that suggests otherwise. Using cards, rather than having ‘experts’ present their perspectives face-to-face, might help to maintain that ‘neutrality’. However, keeping steady feet on the balancing beam is not straightforward and, as you'll have noted, it is still possible, as with all engagement processes, to bias proceedings inadvertently.

One option is that both the game-makers and the game-participants should set the agenda – which is indeed what has happened in practice. The process has to include sufficient briefing (factual background and identification of the issues) to enable participants to get to grips with often complex issues, whilst at the same time providing sufficient flexibility to enable participants to go beyond the material provided in the cards and make the discussions ‘their own’ – exploring their own responses. This takes time, and requires careful management. It would be fascinating to compare participants’ responses in games played with, and without, facilitators and also to compare DEMOCS with the wide-scale, self-facilitated discussions initiated and supported by CoRWM.

Activity 6

In light of the difficulties of avoiding bias, you might like to:

  1. Think about how you would go about developing a set of information and issues cards for a DEMOCS game, in order to capture the essence of the topic in a balanced, jargon-free manner, picking up on the key points of interest for (lay) citizens.

  2. Consider possible means of ensuring that ‘deliberation’ (rather than polarised ‘debate’) takes place during the DEMOCS game, so avoiding the prospect of hijacking by those with particularly strong views. In this context, DEMOCS provides ‘conversation guidelines’ aimed at generating a supportive environment for discussion. (It's helpful to reflect on what deliberation means in this context; a standard definition talks of ‘careful consideration with a view to decision; the consideration and discussion for and against a measure’.)

Finally, imagine that you have been asked to evaluate the success of DEMOCS as an engagement tool, and consider the measures and methods that might prove useful. Note that it will be important to examine both the quality of the process itself and the value of its outcomes.

Establishing criteria for evaluation is far from easy! Section 3.3 of Open Channels, a review by the Parliamentary Office of Science and Technology (POST) of developments in public dialogue published in 2001, lists possible criteria for evaluating the quality and impact of dialogue. It would be helpful to read through that list, seeing how the points match up to your thoughts on evaluating DEMOCS, and to make a note of any difficulties or questions that you think could arise in carrying out such an evaluation in practice. A more extensive and highly relevant discussion of difficulties in evaluating public engagement initiatives, with special reference to GM Nation?, is provided by Rowe et al. (2005).